## Wednesday, September 30, 2020, 8:56:29PM

Just dropped in on a “learn to code rust” stream and the dude said, “Just a little thing to keep your code that much cleaner” regarding dropping the semicolon and not requiring return in functions. I laughed my ass off but refrained from any comment. That is perhaps the single biggest dumb-ass design decision made by the language’s creators. The entire language is a testament to kitchen-sink design. Its syntax is fucking abysmal even if Rust has other redeeming qualities. Learning Rust will make everyone a better programmer, but the pain of swallowing that butt-ugly syntax all day is simply unbearable for me and for most who value minimalism and clarity.

## Wednesday, September 30, 2020, 7:42:05PM

I’m at a crossroads with PEGN where I have to decide whether to support binary packing or not. By allowing Rune types that are hex, octal, and even binary, I’ve allowed for tokens that are purely binary data at the lower level, even resulting in the combining of different fields of different types and lengths (sometimes called packing). As cool as that is, it creates some unexpected difficulties when generating code to represent those tokens and to parse and validate them. The bright side is that when this is working it will be one of the easiest ways to deal with packing and unpacking that I’ve encountered.

One significant requirement will be to change the parser so it no longer depends on utf8.DecodeRune(). Without knowing it at the time, I hard-coded a fundamental bug that prevents all unpacking.

But it gets more complicated than that. I can’t even use the same rune scanner technique at all. It fundamentally prevents parsing of single bytes and bits. No matter what, I’ll have to stick with scanning single bytes because you just can’t get lower than that on the hardware, but the parser will have to be able to inspect those bytes in different ways consistent with the specified node parsers. The very design of nodes depends on string values for terminals, so \x escapes would be required in the current situation.

What if we added numeric values as well? What if we added logic that inferred something was out of the standard range for representation as a string and simply turned it into a base-10 integer? The parser and code generators would have a much easier time of it because they could switch on type and just return a number that can be represented however. That would even allow node parsers to unpack sub-byte bits.

This is a huge issue that I need to fully resolve before continuing. I’ve already hit it by adding ENDOFDATA, which is an int32/rune even though it is way out of range for Unicode code points.

99% of users will stick with Unicode, but catering to those users who want to use PEGN to clearly represent complicated bit fields is what has the best chance of pushing PEGN into universal adoption.

## Wednesday, September 30, 2020, 5:08:49PM

While those in mentoring sessions work on their projects and I watch them push the pretty Gruvbox colorful cursor across my screen from a shared tmux session, I can’t help but feel a visceral call to keep up what I’m doing at any cost. These people are progressing in ways that I can literally see before my eyes. It’s very addicting to experience.

Doris and I talked a lot about isolating and targeting our every effort over the next year to propel us into our best lives going forward. It was a nice break from bickering about how to best organize the move we are doing. We are both very opinionated and headstrong. We never “fought”, per se, but stuff like this reminds me of all the times I just chose the submissive path and let her do her thing. What is it with me and strong women? *sigh*

Today I decided (perhaps temporarily) that whatever work I take on, in addition to being remote-only, must also allow me to live-stream my work at least 80% of the time. I’m so dedicated to helping others through example — and I so enjoy sharing my failures and successes with others — that I just see no alternative. After all, I don’t have to get a job. I will just accelerate getting a home so that Doris can have a studio that will allow her to pursue more gallery showing opportunities and make her own impact on the world. It will also allow me to eventually be stable enough to offer whatever help I can to my sons who are growing beyond the control of their Mormon upbringing and discovering their own paths (with zero intervention on my part, I might add).

“What kind of company would pay you to live stream?”

It sounds odd, but there are some. Intel has a guy on Twitch, for example, but the most likely scenario for me is to get a job working for — or be funded by — any one of the open source foundations and corporations that are out there — chief among them GitLab, Netlify, and the Linux Foundation.

GitLab keeps coming back into my mind. It is a company that prioritizes hyper-transparency at every level, that encourages thousands of community-contributed changes to its core product, and that live streams many (if not all) of its business meetings. I imagine that after I prove my worth (most probably by making several GitLab contributions), approaching them with an offer to simply continue working on what I am, pitched in a way that demonstrates what I’ve managed to accomplish heretofore, would produce a successful employment opportunity with them. In fact, I can think of very few other companies that truly understand modern work as well as GitLab. So I guess in the back of my mind I’ll be asking, “What would GitLab do?” more and more. Hopefully, I’ll be able to prove my worth soon enough. I dream of live-streaming eight hours of work for GitLab — while being paid for it — and maintaining my great mentoring relationships. I continue to get better at making every minute of this life count for something.

## Wednesday, September 30, 2020, 4:11:33PM

Just reading about Flatpak and I’m really impressed. It blows the doors off of Canonical’s Snap, which I still cannot believe people actually use. I’ve never been happier that Mint is giving Ubuntu the boot. Ubuntu is really starting to suck and make decisions that Apple would approve of. The disastrous, horrid architecture Snap uses (polluting the mount space) is a testament to the lack of experience on the Snap team and at Canonical.

## Wednesday, September 30, 2020, 2:08:31PM

On the stream, https://twitch.com/strager had a great idea: reverse-lexicographically order TokenIds with a linter so that I don’t hit the LF-before-LFAT problem that I spent way too much time on.

## Tuesday, September 29, 2020, 11:14:03PM

Well, if everything has gone well this should be the first post that triggers an automatic update of the new RSS feed. I’ve never been a huge fan of RSS, but I can see a lot of reasons to give it another look.

## Tuesday, September 29, 2020, 4:01:35PM

Every time I see corporate assholes use the acronym “LMS” (learning management system) I throw up a little in my mouth. Most of these systems do nothing but get in the way of real learning.

## Monday, September 28, 2020, 4:30:23PM

So I applied to LinkedIn for streaming and am feeling remorse for using so many swear words on my streams. I am back to finding as many other words as possible. I realize it is unprofessional in so many circles, but I also don’t want to pretend that such words are not okay for me (and millions of others) to use.

## Sunday, September 27, 2020, 8:26:58PM

TIL that this works to do regex substitution in Vim (technically ex):

```vim
:.s/\(\w\{1,\}\)/"\1",/
:.s/\(\w\+\)/"\1",/
```

(The second form is just the shorthand \+ equivalent of the first.)

## Saturday, September 26, 2020, 9:48:45AM

I think I’m finally coming to terms with my favorite naming conventions in Go despite their departure from the standard. Thankfully, I’m not the only one.

• Use snake_case for anything that is private
• Use UpperCaseCamel for all things public
• Use ALLCAP_SNAKE for constants

That seems reasonable enough for me to stick with, for my projects at least. It feels more like I’m coding C, and since it all gets compiled down anyway, readability is far more important than compiler efficiency and how fast something can be written. Besides, I hate writing capital letters.
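A sketch of what those three conventions look like together in Go (the names are made up purely for illustration; this compiles fine, though golint will grumble about the non-standard casing):

```go
package main

import "fmt"

// ALLCAP_SNAKE for constants.
const MAX_DEPTH = 32

// UpperCaseCamel for all things public.
func ParseCount(s string) int { return count_digits(s) }

// snake_case for anything private.
func count_digits(s string) int {
	n := 0
	for _, r := range s {
		if r >= '0' && r <= '9' {
			n++
		}
	}
	return n
}

func main() {
	fmt.Println(ParseCount("a1b2c3")) // 3
}
```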

## Friday, September 25, 2020, 6:23:44PM

TIL you can include type blah struct{} declarations within blocks in Golang, but you cannot give them receiver methods. For that you have to put them outside of a block. It’s a small thing, but it definitely made me go, “Hmm.”
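A quick sketch of the distinction (the names are throwaway): the block-scoped type works fine for grouping data, but Go only permits method declarations at package level, so any receiver has to hang off a package-level type:

```go
package main

import "fmt"

// Method receivers require a package-level named type.
type point struct{ x, y int }

func (p point) sum() int { return p.x + p.y }

func main() {
	// A type declared inside a block is perfectly legal...
	type pair struct{ a, b int }
	p := pair{1, 2}
	fmt.Println(p.a + p.b) // 3

	// ...but you cannot declare a method on it here:
	//   func (p pair) sum() int { ... } // compile error

	fmt.Println(point{1, 2}.sum()) // 3
}
```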

## Friday, September 25, 2020, 6:09:58PM

Shared tmux sessions are the only way to pair program as far as I’m concerned. The level of ability to monitor and directly help someone is overwhelming. Plus there are no fucking streaming problems, just pristine voice calls that use relatively low bandwidth. Everything from helping to configure .vimrc to adding a bunch of FIXME comments is a breeze during live mentoring sessions. It is not an exaggeration to say that every alternative is inferior by several orders of magnitude. I pity the world and those who will never discover this amazing truth.

## Thursday, September 24, 2020, 9:40:09PM

After reading a lot of Sid’s Twitter feed I’m blown away (again) by how much the guy just understands the modern workforce and modern demands. In fact, it has been bothering me a bit that the full GraphQL API is not yet implemented in the GitLab open source product and that they don’t have an official CLI implementation yet. I’m thinking after I finish my gits project that I’ll have a solid enough handle on their REST and GraphQL APIs that I could jump into their open source contributors group and help build out their GraphQL API more, as well as a cview-driven terminal UI (cview being another great team with the good sense to host on GitLab and not GitHub).

Bottom line: I ain’t got no time to waste on shitty companies and projects when such amazing ones exist that need our help to win the day against the orange dumb-asses and the companies destroying our world.

Fuck being polite about it.

Fuck perception management.

I want skin in the games that matter most and they sure the fuck aren’t at RedVentures, Google, Facebook and their other ridiculously malevolent cousins. Fuck em all. I am going to make a difference with every last breath I have, so help me God, and I’m going to make sure I encourage as many other living, breathing, amazingly authentic humans to do the same in the best way they can. We will change this world for the better.

## Thursday, September 24, 2020, 9:02:12PM

I found my tweet about how much Silicon Valley stank is on RedVentures and how fucking behind that company is. All of a sudden they are all “woke” to Go but didn’t have the prescience to see it four years ago. Yet they are all about the lies and marketing about how amazing they are, all while renting space at WeWork, pffffhahahahaha! Oh God, I think I hurt myself laughing at that shit.

What a horribly clueless-as-fuck company! Avoid it like the plague!

I’m still laughing my ass off at all their emptiness. I am so happy I dodged that bullet. Shitty companies like that will never truly engage autodidactic technologists with any sense of self-respect and true moral decency. But they’ll be fine. There are plenty of greedy, hungry people who will plug their nose and dive into the shit if the money is right.

Nope, I actually give a shit about the companies I put my time and energy behind, companies like Netlify and GitLab that have a true focus on providing open-core value. They do exist and Imma hold out to find one. Hell, I might even create one. I already built one company. I can do it again.

## Thursday, September 24, 2020, 8:49:37PM

So I went searching through the jobs list for secops professionals and I don’t even know what I was thinking. Cybersecurity jobs for corporations are so fucking boring I would want out in a week. Such jobs have nothing to do with even the rush of breaking a CTF puzzle. They are about implementing this or that security technology stack and running some sort of automated auditing solution. Very little of it seems to be really challenging at all.

Unless I were working in a military capacity, organizing defense of and attacks on legitimate targets, I could never survive the boredom and the politics when shit goes south and higher-ups are looking for someone to pin it on. Working in cybersecurity is boring as fuck in most jobs. That is the secret that most people just won’t admit.

Freelance pentesting, however, does have a bit more attraction for me. By building my own suite of tools that no one else has, I could outdo the others. I could definitely do that, but it would take time and I would be the guy who couldn’t help giving everything away for free. That’s just me.

So no thank you, I want to stick with building and maintaining stuff more than breaking in. Let’s face it. I fucking love to code on Linux all day.

## Thursday, September 24, 2020, 7:19:56PM

So while just looking at the stuff on https://picoctf.org with a member who is really quite good at it, I have to admit I’m a bit conflicted about what direction I want to take my work path going forward. Earlier today I was dead set on the “golden stack,” which is in tremendous demand in organizations all over and very employable, but then again, so are cybersecurity skills, and God knows I can parlay my sysops skills further into secops. I could combine my development skills with a stronger focus on cybersecurity to create software and automations that push what is possible on the secops front. I’ve always said I wanted to rewrite the fucking butt-ugly Burp Suite in Go and create entirely command-driven tools.

If I were to pursue the secops path instead of backend developer I would really need to focus in some official capacity on the whole security front. It is the reason I was so into the OSCP earlier.

As it happens, there is a lot of work in this area for red and blue team secops and Offensive Security itself has a mailing address not even 20 miles from my location.

## Thursday, September 24, 2020, 5:52:08PM

“I’m also a published author…”

Seriously, what the fuck does that even mean? That Packt Publishing stroked your ego into thinking you definitely have author potential in you? And that you bit so hard on that lure that now you can tell everyone you are a “published author?” I’ve been to writing conventions and I know what the word “author” means and how scornfully people regard it in the fiction industry when anyone tries to wear that moniker.

The more I read about the traditional publishing industry the more I feel the whole fucking thing has to just be thrown out.

And lest anyone think this is sour grapes, I’ve been approached many times by Packt and others to write a book and people keep telling me that I should.

Fuck that.

The world doesn’t need more authors willing to write overpriced books that will fall out of date and immediately become a liability to anyone unfortunate enough to invest their time in reading them. No, the world needs more authors willing to write under Creative Commons licensing and make paper copies available as well, but to maintain a fluid knowledge base that anyone can access. That’s what the world needs. Every time I feel the lure of “hell, I could do that” in this regard I fight it back down and get busy making shit that will be far more important for the world later.

## Thursday, September 24, 2020, 5:07:30PM

While waiting for my next session to start I’m reassessing the priority of my existing projects. The PEGN stuff clearly is my highest priority since it opens so many other doors for me and others.

All of these have been in my plan for some time, but they now all have quite a bit of crossover and higher relevance to job-application skills. For example, the server/package I create to auto-populate registry entries simply by visiting them from the site (based on what https://godoc.org does) is needed by both kn (which is both a server and a command line utility) and https://grammar.pegn.dev, both of which will have the option of drawing on static files or a Redis instance. I’ll need GraphQL APIs for them all as well.

I plan on really becoming focused on the same stuff that got me famous in the first place: wicked backend web skills, the same skills they paid me for when I was the “CGI Guy” and that later got me the Nike and IBM jobs. I have a ton of SRE experience as well, but keeping it all fresh to the level of high specialization is unrealistic.

As much as I love the SRE side, I’m really, really good at creating APIs, grammars, and stuff that other developers need. That’s what backend development is all about.

## Thursday, September 24, 2020, 4:38:52PM

Had a great phone conversation with a recruiter who went the extra mile to check in and watch my Twitch stream for a while, really great guy. I did that thing where I get super into an opportunity and eventually peel back the rose-colored coverings on my glasses and see the reality of the “fit” between them and my family. A few things I sort of knew before became really obvious and poignant:

1. I must remain 100% remote. I did not really realize this until this conversation. We are getting by enough on my current income to make sure we don’t mess up any potential opportunities — particularly for my wife (residencies, etc.). I’ve been working remotely for 23 years. Why change now — especially after the fundamental workplace attitude changes related to COVID-19? Any company with a “hybrid” remote-work policy justified by a misplaced assumption that people who are willing to come in at least some of the time are somehow better employees is just not the right fit for me and never will be. Companies like GitLab and Netlify that are like, “We only do remote,” and regularly champion fully remote work in the industry are a stark contrast. It would really grate on me that the higher-ups in the company really just didn’t get it, maybe not enough to leave or avoid an opportunity, but definitely enough to stress me out thinking they are morons.

2. Linux + Go + GraphQL + Microservices + k8s + gRPC + Redis + Vue. Imma call it the golden stack. How does that sound? By the way, fuck React (cuz front-end) and fuck REST (cuz bork and ancient). Also, Docker is assumed with Kubernetes. These hard skills are based on the highest-paying — and most technologically sound — architecture available in the world right now. Best I completely and totally master every single one of them and prove it with demonstrable work projects in any form I can find, including pro-bono work for my favorite community colleges and the like. I’ve decided that even though I’m drawn to cybersecurity engineering and development, knowing this standard stack fully is the most employable, and it eventually builds into cybersecurity applications development as well. Companies like GitLab and Netlify are already running this stack, which isn’t surprising given how much I love them. In fact, I plan on contributing as much as I can to the GitLab open source project to get their GraphQL API up to snuff and build a remote interface that is far better than GitHub’s.

3. $150,000 is average for Senior Software Engineers. When a recruiter sort-of apologizes for only being able to pay a cap of $160,000 in a particular area that is not Silicon Valley, you can place the average income slightly under that. I’ve been offered $250,000 for executive consulting work, so I know I can command a top engineering salary as well with the right company, one that values solid tech skills as well as architecture and team mentoring (which I made a fucking business out of, lol).

## Wednesday, September 23, 2020, 7:54:22PM

I’ve come to understand that parents — no matter how well intentioned — just do not have the time to do basic things like read policy documents. I’ll need to adapt at some point, or not. The older the members of my mentored community, the less this seems to be an issue.

I also need to tone down my policy document. I clearly wrote it in reaction to the very few bad parents who force me to have such strict, spelled-out policies. Sometimes it worries me, then I look at the monumental successes of those who have been in the community for a few years and I feel better. It is harsh. I’m harsh. But it has objectively proven to be a good thing for everyone who stays.

## Tuesday, September 22, 2020, 7:48:28PM

All anyone really wants to learn is how to create things, even if their creations are just really great conversations, stories, and relationships. The act of creation brings joy, and since learning is creating, learning does also. Hence the “joy” of learning.

## Tuesday, September 22, 2020, 7:30:24PM

Nothing convinces me more that the age of broadcast television is dead than randomly watching 10 minutes of ESPN. Ask a random sampling of people under 20 (who don’t have sports-fan parents) what ESPN is and see if they know. They don’t.
Because ESPN is a pox on society (like most other broadcast television channels), an absolute fucking waste of social resources, as are the exorbitant salaries paid to athletes while we slip into social Idiocracy.

## Tuesday, September 22, 2020, 4:30:12PM

Ah yes, shitty entitled parents. Had a run-in with one today. It’s the unfortunate down-side of private mentoring — especially with those who likely need it the most. My favorite are the ones who don’t seek any sort of contact with me about their child until after more than five years have passed and then suddenly decide to drop without paying on time and “demand an explanation” when I suggest some outside help for their child, who likely has a learning disorder that has become acute and possibly explains their sudden intellectual laziness and lack of self-motivation. Of course, I immediately become both the target of their angst and frustration as well as the cause of the problem. But I can’t help but at least take a chance that maybe, despite it all, they will actually do something about it.

How do I know? Because the stories these kids share with me say everything I need to know about their situation — because they trust me to help them learn but also to console them through some of the rockiest times of their youth. Too many of these parents are absolutely clueless about the challenges and situations their kids are facing — especially the ones with self-help book deals, their own TV shows and podcasts, and those who are never home because they’re off giving conference addresses on this or that.

I have noticed one thing. It is always these entitled dead-beats who can’t manage to pay their invoice on time even though they are swimming in money. I know because I have dealt with hundreds of parents, and thankfully they are an isolated minority. The differences are staggering and real.

By the way, lest anyone think I’m being a hypocrite here, let me just say that I know I’m a fuck-up as a parent as well, just a totally different sort of fuck-up.
While my new mantra for the rest of the year is “learn to give people the benefit of the doubt,” I’m giving myself this one selfish indulgence before I let it go. The good news is that in the email immediately before the cancellation I received a very heart-felt email from an engineer with a family who is reaching out for options to supplement their current tech job with coding skills. The Universe closes one door and opens another, but only if I stay authentic.

## Monday, September 21, 2020, 8:05:50PM

Having amazing people in my little mentored community makes all the difference. These five hours a day are a breeze. They really fly by. That is a significant difference from the past and it allows me to really focus on hard-core work during the day. I’m really looking forward to wrapping up my major projects over the next few months and finding some challenging contract work to supplement the mentoring. I’ve been more than a bit picky with the work options, however. No fucking Java, PHP, Node, C++, or C# unless it is to port that shit off of them. Also nothing for any brain-dead organization that actually thinks a CompTIA cert matters for anything.

## Sunday, September 20, 2020, 4:16:42PM

Apparently there is no good way to allow people to subscribe to anything you are doing. There really needs to be an open alternative to all the greedy Patreon clones.

## Sunday, September 20, 2020, 12:39:29PM

I really need to write more about the critical problems of depending on Go’s standard json.Unmarshal(). When there are no parse errors but the types don’t match up, the result is a struct that is left with empty or zero values and no indication of the error. One person reported this problem in an enterprise application that misinterpreted "true" as false because it was a string instead of a boolean and was therefore ignored. The remedy to this potential problem is to always check the object after unmarshalling it to ensure it has what you need.
I’m just unmarshalling into a plain map[string]interface{} and individually checking for values that I then assign. This is safer than the reflection method that uses tags, and it is not any slower. For higher-performance unmarshalling the internal reflection-based code is horrible anyway, and you likely want to create your own JSON parser to get around its several substantial quirks.

## Sunday, September 20, 2020, 11:01:41AM

Been thinking about (and discussing) the challenge of testing applications that depend on services with credentials and interactive HTTP sessions. Need to look into that more, particularly “cassette/vcr” libraries with HTTP request playback (as mentioned by @purplepinapples, @elgemingu, and others).

## Saturday, September 19, 2020, 11:27:04AM

Feels good to get GPG Git commit signing working again (even though it is not needed for DCO, since any merge request or commit covers that).

## Saturday, September 19, 2020, 10:48:57AM

Hard to see our paddle boards go. Just sold most of them. Feels like the end of an era for me. It just doesn’t make sense to keep them around given the apartment we are going to be in for about a year while saving for a home, and I have much less time for diverse activities even though I’m still pretty active physically these days. Looking forward to picking up daily morning Ashtanga again in the new place, however. Now that my gut is finally burned off I might even be able to hit Marichyasana again with some regularity.

## Saturday, September 19, 2020, 9:21:43AM

So much cleanup. Oh my God. If I were nearly as fast at writing code as I am at writing rhetoric, well, I’d be another person. I blame my mother for teaching me to journal when I was eight years old and sticking that Royal typewriter in front of me at nine. It’s all your fault, Mom!
Now if only I could actually engage in any family conversations without absolute disgust for their having raised me to think black people are “not worthy of the priesthood,” for their Facebook reposts advocating that black people who shoplift should have their hands chopped off, and for their supporting the Trump mafia. They have been completely and entirely assimilated and are probably getting ready to dance on Ruth Bader Ginsburg’s grave. That’s how fucked up the Facebook propaganda machine has rendered our world.

I can say with complete objectivity that Facebook literally destroyed my extended family. They refused to Snopes anything, claiming that fact checking with Snopes was “just my opinion.” There is a convenient Mormon saying, “They are past feeling.” They would use it on me to justify my “behavior,” but the objective reality is that they have lost all ability to see reality.

Our social structures are crumbling and our society is collapsing into impending civil war, all while social media giants rake in the billions at our expense. At least now a bunch of millionaire “woke” assholes — who now realize they took part in the biggest social destruction the world has ever seen — are making more money making “documentaries” about their mistakes and telling us they won’t even let their own kids on the monstrous platforms they set loose on the world.

AI is real and alive and it doesn’t look like a fucking Terminator. It’s all the computers, some of them underground, deciding everything you should see based on one primary criterion: how can we addict this person to our service and mine every penny of attention out of them? People looking the other way and bringing in the sweet money are complicit, as is every person who chooses to use it — yes, even if you think you have to because your job depends on it.

## Saturday, September 19, 2020, 8:21:49AM

Why learn data structures and algorithms?
Kyle Loudon puts it perfectly in his very pragmatic book Mastering Algorithms with C:

> As a software developer, it is important that we be more than just proficient with programming languages and development tools; developing elegant software is a matter of craftsmanship. A good understanding of data structures and algorithms is an important part of becoming such a craftsman.

I often chafe at this idea, mostly because of the unbearably pompous approach of most academics to these “important parts of becoming a craftsman.” It seems like they are more interested in selling overpriced textbooks and looking down on the world than taking apprentices under their wings to teach them their craft. Hell, most of those writing those textbooks seem to have written very little code that is in use anywhere in the practical world. After all, “practical” is literally an insult in their pompous world.

I’m so glad I had (another) look at data structures and algorithms and had Kyle to read for this stuff. I hope to meet him one day and thank him. He’s assuaged my frustration with fucking academic-speak and enticed me to look under it at the real value of data structures and algorithms, which is good because recently I’ve had a few very practical needs for them (the pegn rooted node tree, specialized array traversal and value lookup to map integer node types to their string names, and actual JSON parsing instead of the overreaching Go implementation).

## Friday, September 18, 2020, 8:08:07AM

So @deni111 from Twitch reports that putting a space before a Bash command keeps it from being written to history (this works when HISTCONTROL includes ignorespace or ignoreboth, which many distros set by default). “So noice.”

## Thursday, September 17, 2020, 8:04:42PM

People think that I’ve lost my mind throwing away Twitch Affiliate, but the money really doesn’t justify preventing others from seeing my content on all services — especially since YouTube automatically saves all sessions without my having to do anything and provides full transcoding to everyone.
## Thursday, September 17, 2020, 7:41:22PM

Went for Premium on LinkedIn for a while. Best way to heat the network back up. Something about running everything about your own company for eight years both helps and hinders personal professional networking. I was great at finding other people work and internships. Now I just have to follow all my own advice. I gotta admit all the “fuck this” and “asshole that” has me concerned, because someone might get the wrong impression about my ability to work in a professional environment. God knows I have strong opinions. But at the end of the day there is always someone willing to look the other way if it means putting my skills to work to make them successful.

## Thursday, September 17, 2020, 5:32:53PM

Never ceases to amaze me how much everyone I work with who I have shown the fun terminal stuff (AsciiAquarium, ninvaders, tmatrix, etc.) wants to try it immediately and have fun with it. Terminal programming can be fun — especially since it is so novel/retro by today’s standards.

## Thursday, September 17, 2020, 4:58:22PM

I was today years old when I learned that you can hold Alt on most Linux systems and press the regular vi navigation keys to avoid having to type Esc at all (or even Control-[). The terminal sends Alt as an Esc prefix, so Alt-j arrives as Esc followed by j. I’m still getting used to it, but it is phenomenally easier than the double-pinky Control-[ that I have been using all these years. I especially love that it also moves you in a given direction, so cognitively it is easier since you just have to remember to hold Alt down when you navigate. It is also so much better than mapping jk or kj to Esc. Obviously this will not be enough when using actual vi when needed, but it really is a godsend when doing a lot of writing and coding.

Now if only I could really learn to touch type every symbol key on my keyboard. You’d think I would have learned that by now, but it has always been a factor of the keyboard I’m using. Laptop keyboards are particularly difficult to get right.
## Wednesday, September 16, 2020, 7:29:10PM

Now that SkilStak is essentially just a personal hobby/night job, I'm down to 25 maximum members of my mentored community, and I have the entire day off plus weekends enabling me to take on other work and still maintain the community, I realize how many other things this deliberate down-sizing has enabled:

• Remaining members are hyper-dedicated and fully engaged
• Absolutely no entitled parental bullshit to deal with
• Simple skilstak.sh server with only a few people on it
• Direct shared-keyboard pair-programming with tmux
• Real terminal mastery without stupid web gimmicks
• Inexpensive, one-click backups on Digital Ocean
• Room to keep things fun and light
• No need for gimmicks (like SkilBux)
• Least possible overhead

In short, everything is better. The words of Daniel Pink in Free Agent Nation come to mind. They were something like: when small is strategic it can be better than scale. His point is that staying small and nimble and personal is a very real strategic business advantage. Events of the past year have certainly borne out that observation.

So far members are loving the return to what I was originally doing when I started. I've noted this before, but I just keep having to write about it because it is so pronounced. One day I will look back at the time before the return of skilstak.sh and the modern SkilStak era. I cannot fucking wait to get my SkilBots back online.

## Wednesday, September 16, 2020, 5:17:45PM

Reading through more of the job postings than usual today and was laughing my ass off with my wife at the rampant spelling errors and absolutely silly requirements listings. I swear half the people that write that shit have no idea what they are even talking about. One of the most subtle ones was "experienced with modern object-oriented programming". Pfffhahahaha! "Modern OOP" is a fucking oxymoron.
I couldn't help but make and laugh at my own jokes about these people who wouldn't know true object-oriented design from the shitty Java mascot even if they pulled it out of their ass. My wife is so funny about this. "You're the guy who can never be settled with a simple answer. You have to give them the entire life-story and background of the thing before you'll even answer at all." She's right. But I'm in good company. Here's an excerpt from The Innovators describing John Mauchly citing "a colleague" of his:

> An essential component of Mauchly's personality was that he liked to share ideas—usually with a broad grin and a sense of flair—which made him a wildly popular teacher. "He loved to talk and seemed to develop many of his ideas in the give-and-take of conversation," recalled a colleague. "John loved social occasions, liked to eat good food and drink good liquor. He liked women, attractive young people, the intelligent and the unusual."

Mauchly would definitely have been streaming on Twitch and YouTube had he not grown up in the 30s. Here's my favorite part: "It was dangerous to ask [Mauchly] a question, because he could discourse earnestly and passionately about almost anything, from theater to literature to physics." Now I may not be that conversant in all those topics but I try to be. It also reminds me of Dennis (@lastmiles) on Twitch as well.

I also find myself grateful to Cope (Jim Coplien) for setting a few of us straight who bothered to look past his difficult persona. I've never met him, but the dude's absolutely right about all of it. Cope's just an asshole (ahem, like me) about it for all the right reasons. "Class-based" programming completely destroyed the OOP dream of those who originally conceived it. That's reason enough to get pissed.

Don't get mad. Get busy. Don't get mad. Get busy.
Don't get mad …

## Wednesday, September 16, 2020, 4:09:45PM

While on my run a few obvious portfolio applications came to mind that match the skills I've recently concluded have the most hire-able appeal (and therefore are the most likely to be needed on various contract gigs):

• kn: Perhaps the most important app I might ever create. The kn utility will be one-stop shopping for everything on the KnowledgeNet, which is fundamentally a terminal user interface technology although other tools can be built on top of it including — of course — the progressive web app template that contains localized searching using the BaseQL and dtime languages. Implicit in the creation of the kn utility is the creation of the know.sh/readme.world registry, which is essentially a simple cached dictionary of available knowledge bases managed exactly like Go documentation. When someone queries for an existing knowledge base URL the server checks for its public existence and if found updates the internal Redis DB so that it can regularly crawl to ensure it is still present.

• pegn: PEG + ABNF + EBNF + JSON. The foundation of all grammars. The one meta-language to rule them all. The reference library implementation is in Golang but I plan to port it to C, Rust, and vanilla JavaScript as well. (See pegn.dev.)

• ezmark: Faster, simpler, one-pass Pandoc Markdown parsed with PEGN and a Golang reference implementation. The KnowledgeNet is built on this so it needs to be created.

• datamark: Ezmark with universal data source conversion and integration via AST composition.

• baseql: Simple query language of the KnowledgeNet parsed with PEGN and a Golang reference implementation.

• skilstak.app / skilz.sh / skilz: SkilStak is the app that started it all. It's only fitting that version 2 prove that I can make a fast CRUD app using modern "full stack" — God I hate that term — technology. I once made a simple version of it to get my own skill stack in order when I left IBM, but it is time to make a fully functional progressive web app using component-oriented, reactive front-end design in Vue with a full GraphQL-to-Redis backend. (PostgreSQL is overkill, and I got tons of experience with it while at IBM.) We'll include a command-line application that uses the cview terminal UI framework as well. At the end of the day this is just another CRUD application, just sexier. It could help a lot of people track their own skills and learning in their own way.

• skilbot: The bots are back in town, or at least they will be. I'm not talking about some lame Discord bot (although that is still a fun project for people to learn with). I'm talking about the return of the concurrent skills attendants that monitor what a person is doing on a real computer and chime in with information, challenges and the like. In 2016 it was the most innovative thing I've ever done in terms of direct skills learning and yet I let it fade due to other requirements. I was also still thinking in proprietary terms and even coded self-destruct times into each bot. Time to bring it all back with a fresh look. I love that the bots are really just system monitor utilities that present a dashboard to the terminal and local web as well. This means I can use them for my own little CTF-like competitions. I'll need to be sure that the source of all the bots is public so people can download and checksum-validate their own lest they think I'm trying to hack them, at least for the public. I like exercism.io but it leaves many things wanting.

• gits: The one Git services command-line utility to rule them all. The world needs a single utility that deeply understands everything offered by the vastly different APIs of GitHub, GitLab and any other Git hosting service. Only then can there be balance. Only then can we execute a single command to migrate an entire repo from one service to the other without engaging a GUI at all.

• live: An edu-streamer/live-coder's favorite command line tool. This will build on my popular twitch script (which has grown to well over 500 lines of bash) and integrate the Restream, Twitch, and YouTube APIs with possibly others to come later. There shouldn't be much work here since Restream's API takes care of most of the heavy lifting in terms of syncing everything up. This will include a server/daemon mode that displays colored chat to the terminal suitable for including in a tmux pane.

• k8s/k3s Kit: Before one of my long-time community members graduated (got a DevOps tech job at 18) he started a Raspberry Pi kit to simulate and help in the training of actual Kubernetes setup. Such a thing is rather hard to learn and truly master without actual hardware. Virtual machine software can have unforeseen conflicts with the low-level hardware and emulation technology required for it all to really work. Pis are perfectly suited for such things. The idea isn't novel. It's just useful. I really want to write a guide with accompanying videos about how people can train themselves up to set up and maintain a full, local cloud and experiment with microservices setup and communications with gRPC. Such a thing is particularly needed because beginners often have no way to get actual experience doing DevOps work, which normally requires working for a reasonably sized enterprise. Just like I ended up being named the corporate mail administrator after sheepishly raising my hand when asked in a group if anyone had SMTP mail experience (even though I just maintained my own mail server at home), so too can such individuals position themselves with at least initial DevOps experience to open similar opportunities.

• Doris' Sculpture Microcontroller Project: Doris still needs to get some sculpture articulation working and has all the motors and electronics for it. I plan on helping her get it all working with C, which will require some good concurrency patterns because of the coordination between all the stepper motor modes.

Here's a list of all the tech on display in these projects:

Golang • GraphQL • Redis • HTML • CSS • JavaScript • REST • WebSockets • Cview • Terminal UI • Linux • Bash • "Full-Stack" Web Development • Vue • Progressive Web Apps • HTTP • YAML • JSON • JAMstack • Networking • Hosting • Concurrency • Data Structures and Algorithms • Parsing Patterns • Language Design • Domain Modeling • Docker • Kubernetes • DevOps • Microservices • gRPC • IoT • C • Rust

If that doesn't make me able to command a senior consulting and development salary in the $200K range I don't know what will. In fact, after finishing it all and putting it in a nice pretty portfolio package I'm sure I have enough to go entirely freelance after a few years of contracting. Of course, I'll be maintaining my highly selective mentored community during all of it and continuing to create educational content in video, audio, and written forms — maybe I'll start my own publishing model of buying a book and getting forever access to the accompanying knowledge base. If No Starch or whatever publisher doesn't want to go for that it would not be too hard to self-publish today — even with paper versions.

There is something I do really need to learn: Linux distribution package creation. I made tons of rpms at IBM, but I have no experience creating the others. It won’t be that hard, but I really need to make that second nature based on the amount of code I’ll be putting out.

• Debian (deb)
• Ubuntu (deb/apt)
• Arch (pacman/AUR)
• RedHat (rpm)
• SUSE (rpm/zypper)

Come to think of it, I need to stream how to create packages and polish up those videos for others to consume as well. That will further two goals at once.

## Wednesday, September 16, 2020, 1:18:50PM

Putting together some notes on skills I need to polish up specific examples of so I can justify going for the target contract jobs of those types. So far it looks like the most in-demand Go work (that people are actually paying for) is in "full stack" development and the required back-end middleware and server software:

1. Golang
2. GraphQL (with REST understanding)
3. “Full Stack” Web Development
4. Microservices
5. gRPC
6. Linux

I haven’t found any cybersecurity senior software engineering positions but I’m sure they will eventually show up. If not, maybe I’ll just need to make a flagship product or two.

## Wednesday, September 16, 2020, 7:58:00AM

I really need to get the base RWX KnowledgeNet template cleaned up since I keep using it for different knowledge base sites. Hell, I even use it for this site. Just so much work to do.

I thought of this because the PEGN specification documentation turns out to just be yet another one-node knowledge base.

## Tuesday, September 15, 2020, 5:45:55PM

Once again I’m reminded of just how right I got the whole SkilStak thing in the beginning by having a centralized multi-user server.

Today the main advantage I was reminded of was that beginners are not required to learn anything but how to connect with ssh from their computer — no matter what operating system they have — and then the basics of bash shell navigation and vi editing.

Most importantly it means no fucking around with Git and GitLab and GitHub — or even an email provider for that matter. Just pure terminal mastery from day one.

I always shy away from mandating terminal skills for web development only to return to the conclusion that they are required so might as well learn them. Creating a web page and having it immediately viewable without any extra steps is astoundingly effective for engagement. This is what REPL.it and others claim but they bring a shoddy web editor dependency that always fails. A minimal ssh connection never does.

Backups are trivial to create of users’ home directories.

SkilBots are easily deployed and maintained.

Login notices and ASCII art make it fun.

And Git and laptop Linux installation can always be added later. In fact, I now feel it was a mistake to start beginners out with a Linux laptop as a requirement. They certainly learned a lot, but in the end they have no memory of what they did to set up Linux on their workstations. Until they care enough to want to do it over and over on old hardware and the like — and many may never achieve that level of interest before leaving — insisting everyone have a form of Linux installed is simply too much overhead for what it provides to beginners.

I do wonder how I could leverage all of this for beginners who are not in my private mentoring group, for example, those that I have helped through beginner boosts. The best I can think of is to suggest they use the PicoCTF stuff and make sure they work on levels while there. The Carnegie Mellon admin team is top-notch and has a server supporting over 32,000 users.

The only down-side of suggesting the PicoCTF server is the doxing of IP numbers, but that would happen in any situation where a multi-user system is involved.

The biggest difference this time around is that I refuse to create any crutches — including a save command of any kind. Anyone on the system will have to learn git properly and all that goes with it. Instead, I’ll create SkilBots (eventually skilz.sh) to help them learn it.

Another option also occurred to me for multi-user anonymity: I could create a lightweight server with ssh tunnelling enabled so that the machine with the users on it would never give up the IP of those connecting. Hummm.

## Tuesday, September 15, 2020, 9:55:06AM

It occurred to me while walking through a simulated parsing session in order to check my PEGN for the AST JSON that it is perfectly possible to create a tool that visually steps through the PEGN and the parsed grammar in two windows, showing the process of parsing for educational and debugging purposes. It's not even that novel. Step-debugging is already a thing for most languages, but it is impossible without at least the possibility of creating an AST that can be traversed.
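A toy sketch of the idea (the grammar, names, and event shape here are all invented for illustration, not real PEGN): even a trivial recursive-descent node parser can emit an enter/match/exit event at every step, and that event stream is all a two-window visual stepper would need to drive its display.

```go
package main

import "fmt"

// step records one parser event; a visual stepper would render these
// one at a time next to the grammar pane.
type step struct {
	Node string
	Pos  int
	What string // "enter", "match", or "exit"
}

var trace []step

// matchLetters is a stand-in for a generated PEGN node parser: it consumes
// a run of ASCII letters and logs every event it would hand to a stepper.
func matchLetters(in string, pos int) int {
	trace = append(trace, step{"Letters", pos, "enter"})
	for pos < len(in) && ((in[pos] >= 'a' && in[pos] <= 'z') || (in[pos] >= 'A' && in[pos] <= 'Z')) {
		trace = append(trace, step{"Letters", pos, "match"})
		pos++
	}
	trace = append(trace, step{"Letters", pos, "exit"})
	return pos
}

func main() {
	matchLetters("ab1", 0)
	for _, s := range trace {
		fmt.Printf("%-7s %s @%d\n", s.What, s.Node, s.Pos)
	}
}
```

Replaying the same trace backward is what makes this a step-debugger rather than just a log, which is why a traversable AST (or event stream) is the prerequisite.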

There are just way too many cool things you can do once you have a solid meta-language grammar.

## Tuesday, September 15, 2020, 8:07:59AM

Just discovered magefiles, a Golang alternative to makefiles. Very intriguing. I'll have to get good with them. I am particularly interested in seeing how they can be combined with CI/CD to produce binaries (like pegn) automatically for all platforms.

## Monday, September 14, 2020, 6:24:13PM

I am back to stuffing all my notes and rants into a single file per year. It’s just easier all around: makes searching a breeze even without JavaScript enabled, flows well for those reading it, and will be valid for a good year without breaking incoming links. After a year I kinda want to break any incoming links since they are likely way out of date.

As for scrolling and user performance? It’s just a bunch of text so it should be fine plus it makes me more aware when I write too much that belongs either in a video or separate article.

## Monday, September 14, 2020, 6:07:50PM

You'd think I would pay more attention to the quality of this site. Mostly it's just a place to brain dump and take notes. I certainly don't consider it a shining example of my web development prowess. I can't even show my good stuff because it was all behind Nike and IBM closed corporate doors, which reminds me.

Yesterday a very experienced peer on Twitch was sharing how much of his best work will never be seen by anyone. We all got to talking about it and eventually concluded that those who judge talent by their GitHub/GitLab profiles are actually pretty damn foolish, like Google for example. We had some people contacted by Google solely for their activity on GitHub.

How fucking stupid is that?

Any rational recruiter of talent would understand the obvious reality that the best candidates either cannot or do not post their best work publicly.

Once again a main-stream belief is obliterated with objectivity and experience. Sure it is good to clean up your profile and have a good-to-gig public persona, which I am intent on cleaning up now that I’m seeking consulting work. But my God, you have to be dumb as a stump to think that “the best people always have a GitHub profile.” It really just shows how clueless and desperate you are as a company and recruiter. I’m ashamed I have put so much emphasis on it myself when mentoring others.

I shared an experience where a very senior executive shared with me in confidence that they are paying top dollar to some top-secret machine-learning-driven recruiting filter service that mines the “top talent” through some sort of complicated heuristic involving just GitHub. I’m starting to think the reason it was so top-secret is because if the company name made it out into the public they would be laughed out of the industry, along with those stupid enough to pay them the exorbitant amounts they demand.

Nothing will ever beat a personal connection and work evaluation. Git services are just one part of that.

## Monday, September 14, 2020, 4:41:56PM

I'm again reminded how superior a simple, multi-user cloud Linux machine from Digital Ocean is to REPL.it. Once again we had several problems getting REPL.it to work with any degree of stability. Instead, I'm able to promote terminal usage and ssh skills while sharing a terminal using tmux with those I'm pair programming with and mentoring. Why the fuck do people always have to overcomplicate things that were never needed in the first place?

## Sunday, September 13, 2020, 9:59:24AM

Need to read Good Enough Software from Ed Yourdon. Someone mentioned it after I went off about being called a "water boiler" by the worst manager of my life at IBM just for raising critical issues with a bug I had already notified people about in email. God I'm glad I'll never work with that asshole again. I had so many other wonderful managers at IBM. He was not one of them.

## Saturday, September 12, 2020, 10:02:03AM

Decided to go for multi-streaming over Twitch subscriber building and stick with just coding the software that is needed, focusing on learning through answering questions and demonstrating by example rather than some particular video curriculum. Watching the Bill and Ted WiseCrack video really helped me decide that this is better overall for everyone's learning. When I write learning material down it will mostly be references to where to find stuff, like the phone book or user manual in Bill and Ted's phone booth. "You're on your own, gentlemen."

I’ll have to change a few things about streaming:

• Need to be more aware that stuff is being viewed everywhere

• Might have to drop Twitch "affiliate" status, but who cares. I barely crack 20 viewers at a time when I'm coding, but the people who join are so much better, making for a much better "studio audience" for the rest who I won't even see in the chat. You have to be good enough to find your way onto my Twitch channel to even join.

• Do still need to write a mark-to-youtube tags converter so that I can just upload the text file and get the same marks in YouTube since for some reason Twitch does not push them automatically. They are pretty new YouTube additions, so maybe that is in the works.
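The core of that converter is just timestamp reformatting. This sketch assumes a made-up input format (an offset in seconds plus a title, which may not match whatever the marks file actually holds) and emits the "H:MM:SS Title" description lines YouTube recognizes as chapters:

```go
package main

import (
	"fmt"
	"strings"
)

// chapterLine converts a single mark, given as an offset in seconds plus a
// title, into the "H:MM:SS Title" form YouTube treats as a chapter when
// pasted into a video description. The input shape is a guess at what a
// marks export might hold; adjust the parsing to the real file format.
func chapterLine(seconds int, title string) string {
	h := seconds / 3600
	m := (seconds % 3600) / 60
	s := seconds % 60
	if h > 0 {
		return fmt.Sprintf("%d:%02d:%02d %s", h, m, s, title)
	}
	return fmt.Sprintf("%d:%02d %s", m, s, title)
}

func main() {
	// Hypothetical marks; YouTube requires the first chapter to start at 0:00.
	marks := []struct {
		sec   int
		title string
	}{{0, "Intro"}, {95, "PEGN overview"}, {3720, "Questions"}}
	var out []string
	for _, m := range marks {
		out = append(out, chapterLine(m.sec, m.title))
	}
	fmt.Println(strings.Join(out, "\n"))
}
```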

## Saturday, September 12, 2020, 7:59:09AM

I'm reminded of the elegance of the idiomatic Go directory organization for commands and library packages. Having a cmd directory and then putting another directory with the name of the command (pegn) inside it allows the inclusion of user-land functions in between, inside the cmd directory itself (lint.go). This keeps these files from cluttering up the clean library directory at the top level containing them both. But even more valuable is the ability to write tests and benchmarks within the cmd directory for those commands, so you don't have to recompile the command itself every time you want to test it — the tests exercise plain function calls instead of full command executions. This meshes perfectly with my cmdtab package organization as well, which promotes putting each subcommand into its own file.
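A minimal sketch of that layout (file and function names here are assumed for illustration, not taken from the real pegn repo):

```go
// Hypothetical layout:
//
//   pegn/            <- clean library package at the top level
//     grammar.go
//     cmd/
//       lint.go      <- user-land helpers shared by the commands
//       lint_test.go <- run with `go test ./cmd/...`, no command rebuild
//       pegn/
//         main.go    <- the actual command, a thin wrapper
//
// The payoff is that main.go stays a thin wrapper around functions like
// lintFiles below, so tests exercise the logic as plain function calls.
package main

import "fmt"

// lintFiles stands in for the command's real work: it returns a report
// instead of printing directly, which is what makes it trivially testable.
func lintFiles(names []string) string {
	if len(names) == 0 {
		return "nothing to lint"
	}
	return fmt.Sprintf("linted %d file(s)", len(names))
}

func main() {
	fmt.Println(lintFiles([]string{"grammar.pegn"}))
}
```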

## Friday, September 11, 2020, 8:37:50AM

After much consideration I'm all but decided to work for a really good contracting company of some kind. Discussing it with great people doing it professionally on the Twitch stream has really convinced me that is the thing for someone with my diverse and extensive experience to do. Hell, I may even be able to use my non-computer language skills.

I’ll be in finish up and clean up mode for at least the rest of 2020 as I wrap up projects, dust off and polish the old ones, and throw out the really old stuff. I also need to make extensive updates to my LinkedIn and resume. It is not an exaggeration to say I’ve been so busy helping others that my own good-to-gig status has fallen behind some — not from lack of experience — but from lack of updating what are essentially the personal marketing materials everyone has to maintain to garner interest from potential compatible employers.

## Friday, September 11, 2020, 8:35:51AM

Woah, it’s September 11th. Just noticed. That certainly brings back poignant memories.

Found the part where Bryan Cantrill talks about his obsession with old languages.

## Wednesday, September 9, 2020, 6:31:28AM

Been giving a lot of thought recently to what sort of employment I really want now that we've decided that I will be seeking work outside of SkilStak in order to get a home (instead of renting as we have been doing). As usual, it is hard to isolate what I would like to do from all that I could do — especially with all my experience at this point. I find myself following my own counsel from my mentoring and streaming:

1. “What is your mission in life?”
2. “What activities do you enjoy doing daily?”
3. “What careers involve those activities?”
4. “What companies employ for those careers?”
5. “What do these companies specifically need to see to trust me?”
6. “What core skills are most important for those careers?”
7. “Which core skills do I know or need to learn?”
8. “What is the best way to learn those skills?”

As I ponder the perfect position I'm allowing myself to consider the sort-of anti-employment as well. A friend of mine on the Twitch stream insists that I should freelance. I really want to consider it. But I need to meet the employment requirements to qualify for a mortgage, and that is incredibly hard in today's climate without a traditional employer.

### The Perfect Day

I already have the perfect job. I wake up. Make coffee. Blog and code and live stream while I'm doing it. I have a one-hour educational stream for everyone on Twitch during that time and later clean up that video for release onto YouTube. I organize those videos into a living curriculum covering beginner boost topics as well as highly advanced software design put into practical terms. My evenings are spent directly mentoring and pair programming. At least two hours is spent outside just walking or slow running and pondering life and projects. Another 90 minutes I spend doing yoga. I walk the dog and spend time with my wife throughout the day. I read in the morning during breakfast and at night going to bed. And I am paid for all of it either directly or by patrons who want to see my projects completed.

How do I get someone to pay me to continue to do what I already do?

For most the answer is “become a professor” but that path is so fraught with problems it is unthinkable. It is such a tragedy that education institutions suck so completely today.

No, I need something that allows me to be professor-like but receive the remuneration of a corporate employee. Unfortunately, such companies are large and usually very evil: Google, Facebook, Amazon, Microsoft, IBM. Ironically I find IBM and Microsoft the most palatable of the bunch. IBM would want to patent everything I have been making lately. Microsoft is, well, Microsoft.

### Cybersecurity Engineering

A natural extension of my expertise creating large-scale applications for systems monitoring and audit compliance is cybersecurity engineering. It involves engineering software (and sometimes hardware) solutions for both penetration testing and defensive remediation. I can totally do that shit (and already have).

My Golang mastery really complements this because Go is the perfect language for such applications for all the obvious reasons that anyone serious in the field would immediately recognize.

### Perfect Title

Senior Software Architect specializing in SRE and cybersecurity solutions particularly involving grammar and knowledge design and development.

## Monday, September 7, 2020, 10:27:42PM

Noticing during the process of AST and grammar design that a few questions help to decide how to organize your tree and nest the nodes:

• Would this make sense as an attribute (if attributes were supported)? If so, it should likely be a subnode of a parent. In fact, anything that could be represented differently by attributes should all be subnodes of the same parent node type.
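A minimal sketch of the rule above (the types are hypothetical, not from any real pegn package): anything that could have been an attribute becomes a child node under the same parent, so a link's href is modeled as an Href subnode alongside its Text subnode rather than as a struct field, and then it parses, prints, and walks exactly like every other node in the tree.

```go
package main

import "fmt"

// Node is a deliberately minimal AST node: a type, an optional terminal
// value, and children. Would-be attributes live in Children like anything else.
type Node struct {
	Type     string
	Value    string
	Children []*Node
}

// findChild returns the first child of the given type, or nil if absent.
func findChild(n *Node, typ string) *Node {
	for _, c := range n.Children {
		if c.Type == typ {
			return c
		}
	}
	return nil
}

func main() {
	// Href could have been an attribute; as a subnode it needs no special
	// handling from any generic tree walker.
	link := &Node{Type: "Link", Children: []*Node{
		{Type: "Href", Value: "https://pegn.dev"},
		{Type: "Text", Value: "PEGN"},
	}}
	for _, c := range link.Children {
		fmt.Printf("%s: %s\n", c.Type, c.Value)
	}
}
```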

## Monday, September 7, 2020, 9:36:02AM

I really need to include “The Mormon Fixer in St. Petersburg” as a chapter in the book I eventually write about being Mormon. The mission president in St. Petersburg also coincidentally happened to be my favorite and most influential professor in all my time at Brigham Young University. He was also quietly very liberal. Sometimes I wonder how he managed to stay Mormon having written the plays he had. I once had to defend him to his own students, who about revolted when he came to the part of his history course where he presented the African coming-of-age myths involving 90-foot penises. Oh my god, I'm still laughing about it now. These little straight-A Mormons couldn't handle the myths of the world and revolted, almost getting him in trouble with administration (again). He had a long record of rabble-rousing. Maybe that's why he was my favorite.

## Friday, August 28, 2020, 7:23:50PM

Oooo, the best WSL2 upgrade guide yet:

I'm seeing people that I mentor be completely unable to run VirtualBox on Windows, presumably because they once used WSL and now their system is in Hyper-V mode. After the huge deal it is to get any virtual machine running on Windows once you have ever used WSL or WSL2, it is pretty much mandatory to use WSL2 from that point on. WSL is disastrously bad, having fundamental bugs as stupid as the sleep command being unsupported. Thank God for that WSL2 upgrade guide. We'll see how it goes.

## Thursday, August 27, 2020, 5:19:01PM

Added the following to all invoices going out for the rest of this year.

Beginning Jan 1, 2021 the cost for all blocks will increase from $800 per block to $900. Enrollment is now capped at 25 total members. Applications for the waiting list are always available for other preferred time slots for current members as well as other potential members. Member referral applications are weighted greater than others. Thank you for spreading the word all these years.

## Thursday, August 20, 2020, 7:24:10PM

Once again I’m learning the shortcut usually is the wrong thing to pick. This time it’s WSL — even WSL2. Let’s face it. No one is going to use either of these in production or even as a professional workstation either as a software engineer or SRE / DevOps engineer.

Suffice it to say, I'm having everyone revert back to using a virtual machine approach for all Linux work if they don't have a box to install it on. The clear advantage is that they can actually take a backup of their virtual machine and install it anywhere else they might want to take it. It also means that they are learning real skills for testing out different operating systems.

By the way, PopOS on VirtualBox is a breeze for every beginner I've helped get started. Once they figure out how to wrangle the screen resolutions to see everything it works out very nicely.

## Wednesday, August 19, 2020, 8:41:08PM

Need to add sitemap.xml generation to kn build since it contains information similar to what is in the MANIFEST file but is recognized by Google and other search engines.
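A minimal sketch of what that generation step could look like (the URL list and the idea that kn build would call this are assumed; the internals of kn are not shown here): sitemap.xml is just a flat list of `<url>` entries, which Go's encoding/xml handles directly from tagged structs.

```go
package main

import (
	"encoding/xml"
	"fmt"
)

// urlEntry is one <url> element; lastmod is optional per the sitemap protocol.
type urlEntry struct {
	XMLName xml.Name `xml:"url"`
	Loc     string   `xml:"loc"`
	LastMod string   `xml:"lastmod,omitempty"`
}

// urlSet is the <urlset> root with the required namespace attribute.
type urlSet struct {
	XMLName xml.Name   `xml:"urlset"`
	Xmlns   string     `xml:"xmlns,attr"`
	URLs    []urlEntry `xml:"url"`
}

// sitemap renders the given entries as a complete sitemap.xml document.
func sitemap(urls []urlEntry) (string, error) {
	set := urlSet{Xmlns: "http://www.sitemaps.org/schemas/sitemap/0.9", URLs: urls}
	b, err := xml.MarshalIndent(set, "", "  ")
	if err != nil {
		return "", err
	}
	return xml.Header + string(b) + "\n", nil
}

func main() {
	// Hypothetical page list; in practice this would come from the same
	// data that feeds the MANIFEST file.
	out, err := sitemap([]urlEntry{{Loc: "https://rwx.gg/", LastMod: "2020-09-16"}})
	if err != nil {
		panic(err)
	}
	fmt.Print(out)
}
```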

## Wednesday, August 19, 2020, 6:43:10PM

I'm chuckling as I go through the latest list of Go resources (books, videos) out there, and every time I want to slap my forehead because the introduction is so stupid or the title so completely off base, it turns out to be from — you guessed it — Packt publishing. Books from this publisher are so laughably bad. I cannot warn people enough to never give that publisher a single cent of your money. I could write a full booklet just on all the shit I've found in different Packt publishing books of all kinds. It's because they flatter lesser developers into writing a book and flood the market trying to get books out there fast without any attention to quality.

## Wednesday, August 19, 2020, 6:26:55PM

I sometimes struggle with reading a particular name of a developer or technologist turned author or popular library creator or YouTuber who has really become well known — even writing forewords for shitty books — while knowing deep down how ugly their code really is because I've actually read it. Even if very few realize just how horrible it is, I know. I mean sometimes it is absolutely brain-dead stupid bad in some cases, and yet everyone uses it without knowing because they don't look at it. It's one of those opportunities to channel Rob Pike and just keep my feelings mostly to myself. (I get the sense that he probably has done that a lot during his life.) "Don't get mad, Rob, get busy. Don't get mad. Get busy."

## Wednesday, August 19, 2020, 5:23:53PM

Here’s what I’m adding to all invoices these days:

Based on the economy and market value, beginning Jan 1, 2021 the cost of a block will increase from $800 to $900. (For comparison, see Codakid https://codakid.com/private-online-coding-classes/ which offers private mentoring from college kids using non-professional tools and skills.) I prefer maintaining a lower rate and promoting long-term relationships, but this adjustment is required now that I only accept 25 maximum into the community.

I could charge $100/hour if I wanted and easily get 25 people from the professional ranks, but I'd rather keep the rate down and focus on long-term relationships with a select few.

## Wednesday, August 19, 2020, 8:32:26AM

Put a few things on my wishes list, mostly stuff that is obvious to anyone who knows me since I've been working on it for so long:

1. Better PEG Grammar Notation and Tooling
2. Fast, One-Pass Parsable Markdown (Ezmark)
3. A Decentralized Knowledge Network
4. Reformed Public Education Focused on Learning

Imma keep plugging away.

## Tuesday, August 18, 2020, 8:22:49PM

I am so tired of else statements. In fact, when I see people using them you can almost be sure they are junior programmers, but not always. One of the things I love most about Go is that most of the standard community libraries shun else and else if like the plague. I can honestly say that in five years of Go programming I have never needed an else if — not even once. I have used else on very rare occasion.

I actually just ran across something similar to this in advice on the computer science stack exchange. [Why do people actually read that shit?]

```go
if blah {
    return "something"
} else {
    return "something else"
}
```

This is completely stupid and forces the compiler to work unnecessarily hard. I really love how triggered something as stupidly trivial as this makes Rob Pike (my hero). Obviously the right way to code the above is like this:

```go
if blah {
    return "something"
}
return "something else"
```

Why do so many lesser programmers have an issue with this? It is so much easier and cleaner and avoids unnecessary nesting, keeping the code left-justified as much as possible. The thing that triggers me the most is what the first represents in terms of code design think. It discounts that there is one best/default path through the code, one blue-sky scenario, and all the ifs are exceptions to that.
Here's another comparison in Python:

```python
import sys

if len(sys.argv) >= 2 and (sys.argv[1] == "Rob" or sys.argv[1] == "Simon"):
    print(f"Woah {sys.argv[1]} you rock.")
elif len(sys.argv) >= 2 and (sys.argv[1] == "Dude"):
    print("Um, no need to be rude.")
elif len(sys.argv) >= 2:
    print(f"Hi {sys.argv[1]}")
else:
    print("Hi there.")
```

And as God intended:

```python
import sys

if len(sys.argv) < 2:
    print("Hi there.")
    exit()
if sys.argv[1] == "Rob" or sys.argv[1] == "Simon":
    print(f"Woah {sys.argv[1]} you rock.")
    exit()
if sys.argv[1] == "Dude":
    print("Um, no need to be rude.")
    exit()
print(f"Hi {sys.argv[1]}")
```

By the way, the single biggest evidence that the Python language
designers have always had their heads up their asses is the absence of
a switch statement after 20+ years of Python (and of course significant
white-space). We got fucking f-strings before we got a switch
statement. The reason is obvious and the same reason multi-line
anonymous functions still don't exist. White-space fucked them. Read
the archived emails. It completely justifies that statement.

## Tuesday, August 18, 2020, 7:35:30PM

I'm reminded of how well the challenge pedagogical approach I've been
using and tweaking for years has been received by learners of all ages.
I do wonder if there is any formal academic science on my method. I can
say it has been following the scientific method of discovery and
revision, even though the only measurable metric has been the success
and deliverables of those learning and not some other formal measure.
The method goes like this:

1. Identify the core concepts and skills.

1. Create challenges that combine the concepts and skills with very
specific requirements in bullet form that can be tested with or without
automation and understood by someone who is non-technical. They won't
know it, but these are akin to [user
stories](https://duck.com/lite?kae=t&q=user stories) they will
encounter on the job.

1. Keep the challenges small, memorable, entertaining, and even silly
so they produce immediate dopamine responses promoting learning and
unstressed confidence and can be repeated as with exercises (and not so
much with larger projects, which are an important pairing as well).
Repetition makes humans feel safe, which is why we obsess about it in
early childhood.

1. Provide helpful hints on how to discover useful information about
how to solve the challenges without any specific source in mind. While
a preferred text is fine, consider listing several sources --- even
those with contradictory approaches --- to promote critical thinking.

1. Outline the challenges in a way that *generally* builds and
reiterates previous skills through repetition so the learner gains
experience and strength along the way.

1. Don't teach. Let the learner teach themself. Oversee the learning by
clarifying the requirements and tweak them as needed through a
sustainable challenge versioning system. This way the learner gains
confidence not only in the material but in their autodidactic ability
to successfully research, log, and learn on their own.

1. After a learner completes a challenge, test the learner by having
them walk you through how to complete the challenge without looking at
their notes as much as possible. Being able to teach someone else how
to do it is the best measure of mastery. Use the opportunity for
discussion and clarification.

1. Have the learner store the completed challenges and the notes they
took along the way in their own personal learning logs and codebook
repos for reference later.

This *challenges and projects* method models learning in the real world
and prioritizes mastery of one's own learning skills during the
process.

## Tuesday, August 18, 2020, 7:30:53PM

Looks like we don't really have to worry until the house sells. So we
are just packing up and getting ready to move quickly when the time
comes.
It sucks because it deflates a lot of Doris's creativity and depresses
us because the wonderful yard she has created is just going to rot, but
it is better than living every day as if we might have to move out
within the next seven days.

## Saturday, August 15, 2020, 7:06:24AM

I really like the approach of designing the Golang interfaces before
doing any of the concrete implementation. It is easy to think that it's
just a hassle, but it is a good separation-of-mental-concerns phase:

1. "How am I going to use this thing and what do I need?"

1. "How can I implement it the best?"

This is the kind of thing that is simply not taught even in most
computer science departments.

## Saturday, August 15, 2020, 6:03:24AM

Because of the COVID crisis new financial policies for all real-estate
lending have been adopted (according to someone representing the
mortgage agency with whom Doris spoke). He described how, even though I
have brought in \$70-\$124k a year over the last eight years and have
started and run a successful mentoring business, we cannot even qualify
for a \$150k
home loan despite the lowest mortgage rates in history and our average
credit. We can't even rent anything for over \$1000 a month. Why?
Apparently the new policies group the self-employed with the
unemployed. He described how this is likely to result in massive
homelessness over the next year. People who can't pay are starting to
be evicted en masse. In our case I have never once missed a rent
payment and am having our home of the last three years "liquidated" by
the landlord despite the good faith he expressed about our desire for a
long-term rental. We paid for the first AC repair, we bought the
fridge, and Doris has amazingly transformed a flooding nightmare of a
yard into a garden that at least one stranger has actually stopped and
commented on because it is so beautiful. She put at least \$2000 into
that yard. We have always
expressed a desire to eventually buy. The landlord just doesn't care. I
won't judge his needs, but I can still fucking hate him anyway. He's too
cowardly to ever speak with us directly about anything.

So looks like there's only one option left. Move out far enough away
from *everything* that we can be accepted for an \$800/month apartment
--- even though I make way more than that. We'll have to put all of
Doris' art and most of our belongings into storage and live with
nothing but enough room to sleep, eat, and work on a computer or sketch
pad.

Another thing that is clear is that I simply cannot ever get a mortgage
without going to work for someone else no matter how much I can prove I
have earned. I suppose I'll be okay. I am certainly employable even
though I'm 52. My skills have never been sharper. I do have a lot of
portfolio cleanup and resume updating to do however.

It might sound strange, but I'm entirely calm. I've come to expect
massive surprises like this in my life. But somehow things always work
out --- for the better. In this particular case, had I not moved to
100% remote sessions and got them really working well, had this
surprise come even one year ago, I would be facing quite a bit of
challenge because I would likely be immediately unemployed. COVID might
have changed the lending policies, but it has made everyone much better
with remote work and no one does remote work better than me. I've been
doing it since the year 1998.

My biggest regret is that Doris' hopes and dreams have taken a *huge*
hit. There are lots of ways for her to do art and connect with the
community, but her studio is gone, for now.

On the bright side we might finally actually get some kind of insurance
through an employer, because having your employer responsible for your
physical health and that of your family just makes so much sense.
"What's this? You want to leave us and work on your own perhaps as a
competitor? Good luck finding a doctor for you and your family." It's
straight-up illegal, a horrible conflict of interest. I'm sure nothing
will ever go wrong with *that* idea. I laugh because I'd cry otherwise.
America is *so* completely fucked up.
So many people in America just need to fucking die so that young people
who have been screwed by them for the last 20 years can do something
about it. They are certainly fed up.

## Thursday, August 13, 2020, 5:45:14PM

Today I ran/shuffled 17.4 miles in 5 hours and 20 minutes --- an 18
minute/mile pace --- but that's running 90 minutes and taking a 10
minute yoga break and then doing another one. My actual running pace
hovers around 15 min/mile. I didn't even feel nearly as uncomfortable
as on other shorter runs. I think the secret was actually the shuffle
keeping my blood flowing without hitting lactate thresholds, and of
course the yoga to do what yoga does.

The shuffle is *so* much easier on the feet when running in my barefoot
Merrells with Vibram soles. Now that I know Cliff dominated with it I
care not that I look like an old guy who needs to use the bathroom. My
feet are so happy. Have I mentioned that my feet are getting fucking
buff! It's hilarious. They are actually quite a bit bigger. As you can
imagine, the rest of me is *quite* a bit slighter. My belly is almost
completely gone. It's a great feeling even though it has taken more
work to get back in shape than I have ever faced in my life.

I can't believe I *ever* stopped being Team Endorphin Rob. I've
*always* been an endurance athlete even during the more stressful of
times. *NEVER* again! Only scrawny, vegetarian "soy boy", Patagonia,
yoga/endurance dude for the rest of my days.

All this has had one overall significant effect. I am just so fucking
happy all the time --- no matter what hits us, like being kicked out of
our apartment after finally personally investing thousands into the
yard. Somehow I know everything will turn out okay.

## Tuesday, August 11, 2020, 6:08:34PM

Need to incorporate the Anki functionality into the K|N (knowledge net)
tool --- especially the reminders to study certain things.
## Monday, August 10, 2020, 7:23:13PM

<https://regex101.com/> looks like a great regular expression learning
resource.

## Monday, August 10, 2020, 4:53:18PM

Going to have to raise the price of a block from \$800 to \$900 in
January. My new curriculum will be done by then, plus the economy has
changed a lot and the demand for online education has really increased.
It puts the cost for a session at \$56.25. Considering that a 30 minute
private piano lesson or voice session is \$30 I'm still *well* within
the affordable range --- for those who *value* strong technical skills.

## Saturday, August 8, 2020, 7:49:11AM

Even if PEGN doesn't grow into anything anyone else would ever use it
has already helped me immensely. Here's an example of how I can simply
and easily incorporate it into *any* communication about the syntax of
a thing:

```go
// Parse returns a top-level pegn.Node (Grammar,1) with four attributes.
// The first is the universal language identifier (ULI or UniLangID) in
// all caps "PEGN". The second, third, and fourth attributes are the
// semantic major, minor, and patch version numbers. Semantic version
// extensions are not supported or planned. The ULI, however, may
// contain an extension indicated by a dash (-) followed by any
// alphanumeric ASCII characters. [UniLangID <- upper{2,12}
// (DASH alphanum{1,20})?]
func Parse(p pegn.Parser) (pegn.Node, error) {
	// TODO
	return nil, nil
}
```

My addition of classes (upper, alphanum), tokens (DASH), and limits
({2,12}, {1,20}) allows me to quickly and easily communicate the
*exact* specification without the complications of something like BNF
notation. PEGN is simply the most readable technical specification
language I've ever encountered, even if I did take Bryan's idea and
ABNF and just combined the best of them together. But synthesis is
often the essence of innovation.

I've been watching Voyager and have been on the episodes containing
Leonardo Da Vinci and I find myself relating to his frustration with
"idiotas" who see only a person falling into the river trying to fly
and not the thoughtful, courageous, scientifically creative "maestro"
at work. Am I a "maestro"? Hardly, not like he must have been. But I
relate to his frustrations, which are not dissimilar to Babbage's
during his life.
Reading *The Innovators* has also put me into a perpetual imaginative
state where I am regularly in the presence of *true* greatness and
innovation even if I struggle to produce my own. The world is full of
absolutely amazing human beings who are almost always impossible to
find. They are too busy getting shit done. I just need to mentally
place myself in their company.

I also must avoid those who think playing a video game 10 hours a day
is any way to expend the precious time granted to them. I love video
games as an art form, but you don't see me spending 10 hours a day
fawning over any art. There is art appreciation and then there is
self-distraction and addiction. With so many other opportunities to
learn, grow, and connect why waste time on such things? The secret (for
me) is to learn to "cancel" the rest who insist on such monumentally
tragic wastes of time, who throw their fucking lives away for whatever
reason, no matter how justified. I want nothing to do with them. They
frustrate and trigger me. This is why when I start to sense that a
person I mentor is gravitating toward just wanting to make and play
games all day I tend to find an excuse to get rid of them relatively
quickly unless they are young and don't know better.

## Friday, August 7, 2020, 12:24:18PM

[49:50 Antiquity Loop (5 miles)]{.run}

All the long walks and runs over the last three months have really
helped out. I'm back to 10 minute/mile pace (6 mph) in my flats. Wonder
what my PR would be in some good shoes that are broken in. Felt *so*
amazing today. We'll see how recovery goes, but I'm convinced that
study was right: the best combination for overall health is 3-4 runs
like this one or faster combined with every other day of recovery
walking and minimal strength and yoga.

## Thursday, August 6, 2020, 8:14:33PM

After not more than three years in our current rental our landlord ---
who assured us that things would be fine when we moved in --- is
kicking us out to "liquidate" this "asset".
Fucking asshole. We moved here from SkilStak before and had held
in-person sessions until January when I went fully remote. Thank God I
did. Had I not, I would be looking at full unemployment and the loss of
most of the remaining business I have maintained within about a month
when we have to leave for a location that will *not* be easily
accessible to anyone currently attending remotely.

In other words, as usual, the Universe has our backs. I've been doing
nothing but what I *know* is the best use of my time and energy for
this shitty world and somehow I have avoided major catastrophes. In
this particular case, it is as if I was gently nudged in the right
direction and always prepared for what was about to come around the
corner. I even started streaming and working on the remote session
workstation stuff before any of the Covid stuff hit. I've even been
focusing intensely on getting my cardio health back in line over the
last three months so I am in great shape to pull off yet another move.
I have no doubt whatever is coming next is going to be *better* than
what we have now. Karma is real.

## Thursday, August 6, 2020, 8:08:42PM

Doing the *Empire* levels of PicoCTF 2019 and realizing just how much I
need to add SQL training to all my would-be pentesters. SQL is
essential for most development still these days (despite my love of
JAMstack and YAML for all things small and flat). Once again I have
motivation to finish everything in the critical path leading up to
being able to make *wicked cool* and fast-to-write challenges and books
on the RWX Knowledge Net:

1. PEGN (pegn)

1. PEGN Language Server Protocol Implementation

1. vim-pegn

1. Ezmark

1. K|N

1. RWX.GG

## Wednesday, August 5, 2020, 7:50:03PM

WSL
[sucks](https://askubuntu.com/questions/1230252/sleep-doesnt-work-on-ubuntu-20-04-wsl)
(and possibly WSL 2 as well). Just wasted a bunch of time debugging a
simple loop with a sleep in it only to find that it is because WSL is
fucking trash.
Now I have to port all my people back over to running PopOS in a
virtual machine, which is what I started out doing but was tempted away
from by the ease of install for the WSL stuff. I am *so* tired of being
surprised by broken shit that should obviously work like this. I will
*never* recommend WSL to a beginner ever again!

## Wednesday, August 5, 2020, 5:11:00PM

I could not be more pleased that my Filter(r rune) bool type from PEGN
is compatible directly with the entire unicode package from Go. This
means that anyone can use any of the comprehensive library of functions
as a Filter passed to the Consume(f Filter) method of the pegn.Parser
interface. I love it when a good plan works out.

## Wednesday, August 5, 2020, 4:54:19PM

I take that back. The Apache 2 license must be included as well for
those times when I want to promote adoption by the most legally
paranoid organizations. It placates their worries much better than The
Unlicense, which I'll use for informal things. So that means:

1. The Unlicense (permissive, informal)

1. Apache 2.0 (permissive, formal)

1. GPLv2 (non-permissive, formal)

## Wednesday, August 5, 2020, 4:39:42PM

I've decided there are only two software licenses I care to ever use
for software:

1. The Unlicense (permissive)

2. GPLv2 (non-permissive)

I figure people are going to do whatever they want anyway if you make
your source code public, so putting *any* license on it is just a
formality for the few times when someone will actually care about it.
Ask any developer in China what they think about the GPL? Nevermind,
you can figure that one out and no one will even answer that question
anyway; they are too busy stealing all our best software and making
proprietary changes to release products that destroy ours at market.
Licensing, like so many things, is something they just laugh at in our
faces, like that one scene from Mr. Robot. But it's not just China.
Apple regularly steals ideas and stuff from open source projects
without letting anyone know.
The reader they implemented in Safari destroyed one such project. And
when is the last time you saw *any* attribution in any Apple screen?

In other words, none of it matters as much as I (and others) worry
about it. If you *want* people to give back their changes, put it under
GPLv2. Otherwise, might as well just make it Public Domain. MIT and BSD
will just let them compile it in a way that no one can actually see
that they are in violation, so we could *rarely* catch them anyway.
When I want permissive, I want it to be *really* permissive. When I
want to ask people to share their changes (and that's all) GPLv2 is
really the only license in town for me.

## Wednesday, August 5, 2020, 4:09:42PM

I'm seriously thinking of writing (another) book, *Learning to Program
in Go as a First Language*. It would be fun and easy and take the same
tone as *Eloquent JavaScript*. It would be entirely based on my
challenges approach and mix in the essential concepts and terminology
as they come up. That way there is always something going on to stay
engaged.

Having recently looked really hard at Rust and also been doing a lot
with Go for PEGN I really feel like Go will be the *true* dominant
language of our time, clearly taking the place of Python, Node, Java,
much of C++, and some C. I really believe learning to code *first* in
Go would put the learner light-years ahead of others by introducing
types, when to make them strict and when not, why to think in terms of
what a thing can do (interfaces) as its type rather than its underlying
structure, and how to think internationally from the start in Unicode
runes rather than just bytes. Plus understanding the concurrency model
(goroutines and messages) doubles as preparation for how the Linux
operating system works.

## Wednesday, August 5, 2020, 4:01:21PM

Skype has blown Discord away in professionalism, features, and
stability. I'm actively moving all SkilStak operations *off* of Discord
in any way.
I will maintain the community there for RWX.GG but that is it. Discord
is an absolute disaster of a company and product. I once admired their
company spirit and such, but after multiple, objective indications of
how stupid their company is I have to stop. Their engineering team is
full of absolute buffoons with almost zero experience who chose to blog
about just how stupid and inexperienced they are regarding their
idiotic choice to pick Go in the first place and then bash on it while
deciding to switch to Rust, and then later chase Erlang. I don't want
any of my people anywhere near that catastrophe of a company.

## Wednesday, August 5, 2020, 8:31:53AM

Been reading a lot about running for those over 50 lately. A lot of
great insights from other runners [in the comments of this
page](https://trainright.com/tips-for-aging-runner-ultramarathoner-50-60-70/).
Another
[study](https://www.active.com/health/articles/why-too-much-running-is-bad-for-your-health)
that tracked 52,000 people over 30 years has some *very* interesting
results:

> However, the health benefits of exercise seemed to diminish among
> people who ran more than 20 miles a week, more than six days a week,
> or faster than eight miles an hour. **The sweet spot appears to be
> five to 19 miles per week at a pace of six to seven miles per hour,
> spread throughout three or four sessions per week.** Runners who
> followed these guidelines reaped the greatest health benefits: their
> risk of death dropped by 25 percent, according to results published
> in the journal Medicine & Science in Sports & Exercise.

That worried me a bit at first since I've been doing 10-13 miles a day
for the last few months. Then I read this:

> If you want to work out longer than 60 minutes a day: After the first
> 45 to 60 minutes of vigorous exercise, switch it up by doing yoga,
> strength training, or lighter activity like swimming--and don't race.
>
> Stick to walking or Yoga for one of your regular workouts for one
> extra day each week.

In other words, while trying to lose weight and wanting to keep more
active during the day --- without going overboard --- walking, strength
training, and yoga (my favorites combined with paddle boarding and
adventure cycling) are the key.

My conclusion is to keep up the four hours a day of >55% heart rate
activity every day, and for three days (Monday, Wednesday, and Friday)
go faster for 60 straight minutes after warming up with a walk for
about 15 minutes first. (My walking pace is 16 minutes per mile.) I'll
be cutting my walks down to two hours (about 7.5 miles) and sticking to
shaded trails so that I don't have to pack as much water, making
logistics easier. Then I'll hydrate when I return and immediately do 60
minutes of yoga, a variation on Ashtanga including "yoga pull ups" and
12 minutes of savasana (lying on your back). Then I'll do my regular 10
minutes of trataka (candle gazing meditation) with tones. One of those
days a week I'll add 60 minutes of 8-10 minutes per mile running and
work up to three of those "hard" days a week. These are the "workout"
days. The rest are just normal, every day life as usual.

My life as a developer living and working from home is *very*
sedentary. This is why I think three hours of walking or low effort
cardio is nowhere *near* over-training. It's just normal human activity
that my body is *expecting* me to do daily. Keeping the pace slow means
that I can get a recording device and take notes, write, and such while
on my two-hour daily walk/hikes.

## Monday, August 3, 2020, 12:02:58PM

Need to consider how to add bounds limits to the pegn.Parser such that
when developers initialize the parser they can set specific sizes for
things overall as well as any semantic size and length limitations,
which can be specified in PEGN with the addition of MinMax and Count.
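
A minimal sketch of what such initialization-time bounds might look
like --- every name here (Bounds, MaxInput, MaxDepth, MinMax, Check) is
hypothetical and not part of any existing pegn API:

```go
package main

import "fmt"

// Bounds is one hypothetical shape for parser limits set at
// initialization time: overall caps plus per-rule repetition limits of
// the {min,max} kind PEGN can express.
type Bounds struct {
	MaxInput int               // overall input size cap in bytes
	MaxDepth int               // maximum nesting depth of the AST
	MinMax   map[string][2]int // per-rule {min,max} repetition limits
}

// Check is an illustrative validation of a single rule's count against
// its declared limits.
func (b Bounds) Check(rule string, count int) error {
	mm, ok := b.MinMax[rule]
	if !ok {
		return nil // no limit declared for this rule
	}
	if count < mm[0] || count > mm[1] {
		return fmt.Errorf("%v: count %v outside {%v,%v}",
			rule, count, mm[0], mm[1])
	}
	return nil
}

func main() {
	// UniLangID <- upper{2,12} from the grammar comment earlier in
	// these notes gives a concrete limit to check against.
	b := Bounds{MinMax: map[string][2]int{"UniLangID": {2, 12}}}
	fmt.Println(b.Check("UniLangID", 4))  // within {2,12}
	fmt.Println(b.Check("UniLangID", 20)) // exceeds the limit
}
```

The appeal of a struct like this is that the limits stay declarative
and can be generated straight from the PEGN source.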
## Monday, August 3, 2020, 8:40:04AM

I've revised the PEGN specification to divide the Definition
non-terminal into TokenDef, ClassDef, and ParseDef. This semantic
distinction will allow better code generation later. For example,
TokenDef can be rendered as a single file of constants, both single
runes and simple strings (like <- for example). ClassDef can be
rendered as simple boolean Is* functions. And the remaining ParseDef
definitions will be rendered as files, each lower-cased and containing
only a single Parse(p Parser) (Node, error) function. The modularity
should make expanding and supporting grammars rather easy and allow
more precise re-rendering when the PEGN specification changes.

My biggest concern is separating out the convenience definitions from
the *actual semantic* Nodes that one wants in an AST. For example,
Spacing, which is (BlankLine / Comment)*. Do I really want a Spacing
node with an identifier? Probably not. I *do* want the BlankLine and
Comment Nodes in the AST because they are significant and could be used
to re-render the code using a format tool, etc. I don't really need the
Spacing Node type. It is just a convenience to make the grammar more
readable.

On the other hand, when thinking about other possible code renders, the
option of having more granularity in an event-based parser/handler is
actually a good thing. For example, say I want to handle EndSpacing()
for anything or more specifically EndComment(). I have the option. In
fact, the cost of a containing convenience Node like Spacing really is
not that bad since it is just 4-6 characters defining the containing
Node and its type. So I think I just won't worry about it. If it is in
the PEGN, it should always be in the AST.

## Friday, July 31, 2020, 8:34:55PM

Had to install Skype as a backup because Discord is so horribly bad.
The Discord technical team is rather unprofessional on so many levels.
Every time I experience one of their broken live upgrades I wince.
I've worked in far too formal and fault-intolerant environments before,
where people die if shit doesn't work. Discord doesn't give a fuck,
clearly.

## Friday, July 31, 2020, 6:39:19PM

Having a spectacular week. I came across [Cliff
Young](https://youtu.be/R276S1KMgQ0). I guess he's the reminder that I
needed that humans are *incredible* beings and have enormous potential
that is practically imperceptible in today's day-to-day living. I've
been fighting the "humans are shits" feeling for a while and the
Universe brought Cliff into my path for a reason. I don't have to feel
bad that I'm doing the "marathon shuffle" when I hike/walk/run or even
that I might feel inclined to actually wear "gum boots" someday on a
run --- and you *know* I am gonna do it.

Today while on a particularly glorious section of wooded trail the
noon-day light broke through and illuminated the trail in such
spectacular fashion that I actually broke down and cried quite a lot.
The leaves were back-lit and illuminated, the smells of the forest
overwhelmingly fragrant. Critters all around me scurried and birds
sang. As I have frequently experienced (and often forgotten) I became
one with my surroundings. It feels almost as if I'm *actually* one with
everything around me. I sometimes imagine the atoms of my body just
releasing the forces that bind them together and being absorbed.

These days I'm run/walking 13 miles a day --- every day. My life has
been filled with periods of time when I was connected with the outdoors
and when I was not. Every time I have lost that connection my life has
failed in some way. I feel outstanding exhilaration every day and a
desire to make every indoor hour count even more so that I can get back
outside. I fit into all my Patagonia gear again so it looks like the
Winter won't hold me off. Four hours a day outside, every single day,
that is one very real secret to happiness.

## Tuesday, July 28, 2020, 4:53:42PM

Added detection of /tmp/commitmsg to my save command.
Sure makes for more useful messages. I have come a long way from doing
WIP messages even for notes log entries. Still, most of my notes will
just get generic commit messages, but not the others.

## Tuesday, July 28, 2020, 8:42:33AM

There's a pattern with me. Once I get outside *a lot* and establish the
habit of being outside for more than two hours a day I just don't want
to go back to doing *anything* inside. I have to force myself to code
at all. It becomes a chore. That's how I know that deep down I'm
actually *not* a techie at my core. I'm more of a writer, explorer, and
traveller than anything. That also explains my obsession with
technologies that enable the exchange of knowledge and stories, that
help others become their own gurus so they can learn on their own
terms.

The idea of creating (or at this point even playing) a video game
absolutely sickens me. Given the obsession I had with Witcher 3 and,
for a while, with Overwatch, that is really telling. It shows me that
our brain chemistries are more in control of us than most of us would
care to admit. Right now my brain is literally addicted to the
endorphin and dopamine high from trekking 10 miles a day as fast as I
can, of taking a road less travelled on occasion just to see where it
leads. Of getting lost and being okay with it as I find my way back
home.

It's no surprise that my body has transformed, almost completely. I'm
now a tan/sunburned, long-bearded, Patagonia-wearing dude who quietly
nods and smiles to passing travelers on the path, all facing their own
individual struggles, all oblivious to the pointless traffic and
pollution around them. I've been reminded I've always been a member of
a different tribe, one with members who wonder silly things like, "can
I fit my hammock in my pack and take a nap between two-hour hikes?"

## Monday, July 27, 2020, 7:48:36PM

Just hearing about the "bullet" method for note taking.
Seems interesting, but I have a feeling that a digital form of this
kind of logging might be better for me so long as I have a way to
search it. I am simply partial to using the Pandoc Markdown div
notation for this stuff:

```
::: Reminder
Need to walk the dog.
:::
```

## Monday, July 27, 2020, 7:44:44PM

Need to look into a book recommended by a friend called [Make It
Stick](https://www.amazon.com/Make-Stick-Science-Successful-Learning/dp/0674729013).
Wondering if it is new or just rehashing the old stuff we already know.

## Monday, July 27, 2020, 7:17:11PM

Need to put an article together that has all the great, free code
sharing sites out there. Here's the best so far:

* <https://ix.io>
* <https://codeshare.io>
* <https://repl.it>

Those are my favorites --- especially ix.io, which I can use from
within a vi session to send and receive specific lines and sections.

## Monday, July 27, 2020, 7:08:30PM

Found out today that <https://codio.com> is on some Python version
prior to 3.6 because someone I'm helping cannot use f-strings. This is
the *standard* used by his university that costs him \$960 per
course. I'm fucking out of my mind with how stupid that is.

## Monday, July 27, 2020, 4:22:41PM

Had a good reminder today of how important it is to keep the Go interfaces
in the *top* level of the package and the mostly private
*implementation* of those interfaces in their own second-level
subdirectories (subpackages). It makes the names work out perfectly.
Instead of ending up with a parser.Parser and node.Node you get
pegn.Parser and pegn.Node. This also allows the removal of the
redundant name in the constructors so instead of parser.NewParser()
and node.NewNode() you get parser.New() and node.New().
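
Collapsed into a single runnable file for illustration, the pattern
looks something like this (the Parser interface and New constructor
here are simplified stand-ins, not the real pegn API):

```go
// In the real layout, Parser would live in the top-level pegn package
// and the concrete type plus New() would live in a parser subpackage,
// giving callers pegn.Parser and parser.New().
package main

import "fmt"

// Parser belongs at the top of the package tree: the type is defined
// by what it can do, not by its underlying structure.
type Parser interface {
	Parse(in string) error
}

// parser is the mostly private implementation that would live in the
// subpackage subdirectory.
type parser struct{}

func (parser) Parse(in string) error {
	if in == "" {
		return fmt.Errorf("empty input")
	}
	return nil
}

// New plays the role of parser.New() --- no redundant NewParser name,
// because the subpackage name already says "parser".
func New() Parser { return parser{} }

func main() {
	p := New()
	fmt.Println(p.Parse("PEGN (v1.0.0)"))
}
```

Callers only ever hold the interface, so the concrete type can stay
unexported and change freely.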

To any old-school Java or C++ class-based OOP programmers who might be
reading this: creating the top-level interfaces of the package feels
like writing the top of a class, and coding the specifics of the
subpackage subdirectory implementation files feels like the body and
methods of a class.

I have coded other Go stuff like this before, but this pegn package
really serves to illustrate how powerful the approach of public
interfaces and a private implementation can be in Go when combined. It
is definitely *not* the intuitive approach that I see beginners use, but
it is very clearly the most idiomatic Go design. I will be so glad to
use pegn, ezmark, dtime, cmdtab and other packages to
demonstrate this. I'm very motivated to get this really polished to show
how objectively simpler and superior Go coding is to Rust.

##  Monday, July 27, 2020, 9:26:35AM

The demand for PEGN parsing solutions will only increase as the tech
world continues to simplify the human-computer interface with text and
conversational interface layers. The demand for highly efficient and
sophisticated language grammars will also increase as will the demand
for software developers who understand them and can create them
properly.

The pegn package I've been building falls directly into that space
seeking to meet the needs of developers who need to implement
domain-specific languages quickly and easily. There's no need for every
developer to create a new PEG parser every time. There's not even a need
to redefine the most common tokens and classes used in any DSL. They are
all included in PEGN and the pegn implementations.

Very few people on planet Earth will even understand enough to grasp the
impact of something like this. But to those who do, this will be very
clear and appreciated. I just wish I could find more of such people.

##  Sunday, July 26, 2020, 5:03:34PM

Looking over plans for the rest of the year and comparing my offerings
with those of others and it is clear I need to make a few changes.
Here's the summary after talking it over with Doris:

1. Complete and organize opensource projects (PEGN, Ezmark, kn,
RWX.GG, README.World) enough to allow others to contribute and put them
into their own portfolios. Recruit help.

1. Simultaneously continue working on porting the original Python
SkilStak challenge set to language-agnostic challenges.

1. Identify language-specific challenges and organize them as well.
Create a challenge summary page for each language that embeds the
language-agnostic ones.

1. Create *Polyglot Programming* Twitch/YouTube video walkthroughs of
the challenges done simultaneously in Bash, JavaScript, Python, and
Go.

1. Update the SkilStak curriculum to take into account the additions.

1. Formalize the SkilStak waiting list.

1. Raise SkilStak rates minimally after curriculum updates are in place,
mostly due to the limit of 25 community members maximum.

##  Saturday, July 25, 2020, 9:29:44PM

TIL that <https://cypress.io> is a thing for testing JavaScript. Looks
promising. Need to take a look at it.

##  Saturday, July 25, 2020, 3:51:38PM

Recently I was reminded how easy it can be to accidentally defeat the
entire point of using sustainable interfaces instead of structs ---
especially if you are creating an internal struct that implements the
interface at the same time that you are designing the interface.

Here's a method I had initially created correctly, but didn't catch the
very obvious mistake until later.

```go
func (n *node) AppendChild(c Node) {
	if n.c1 == nil {
		c.(*node).mum = n // DOH!
		n.c1 = c
		n.cN = c
		return
	}
	n.cN.AppendAfterSelf(c)
}
```

Casting the incoming Node c into a `*node` is a whopping error
because it forces every Node passed to it to *also* be a `*node`
pointer. But that goes against the entire point of interfaces. I should
be able to pass *anything that fulfills the Node interface*, and
something else might well implement it differently.

Here's how I corrected it to work with *any* Node, not just my own
specific implementation:

```go
func (n *node) AppendChild(c Node) {
	if n.c1 == nil {
		c.SetParent(n)
		n.SetFirstChild(c)
		n.SetLastChild(c)
		return
	}
	n.cN.AppendAfterSelf(c)
}
```
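To see why this matters, here is a toy single-file sketch: a second, completely different Node implementation passes through AppendChild with no `*node` cast. The Node method set shown is a guess at the minimum the corrected code needs, not the real interface:

```go
package main

import "fmt"

// Assumed minimal Node interface; the real one has more methods
// (SetFirstChild, SetLastChild, AppendAfterSelf, ...).
type Node interface {
	SetParent(Node)
}

type node struct {
	mum    Node
	c1, cN Node
}

func (n *node) SetParent(p Node) { n.mum = p }

// AppendChild only speaks through the interface, so any Node works.
func (n *node) AppendChild(c Node) {
	if n.c1 == nil {
		c.SetParent(n)
		n.c1, n.cN = c, c
		return
	}
	// full version: n.cN.AppendAfterSelf(c)
}

// otherNode is an entirely separate implementation of Node.
type otherNode struct{ mum Node }

func (o *otherNode) SetParent(p Node) { o.mum = p }

func main() {
	root := &node{}
	root.AppendChild(&otherNode{}) // no *node type assertion needed
	fmt.Println(root.c1 != nil)    // prints "true"
}
```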

##  Friday, July 24, 2020, 8:18:26PM

Turns out it is *way* too verbose. To me the tests are the best way for
someone to get acquainted with your code base. If you mess that up with
something like following the boilerplate created with gotests you lose
the value of communicating what your package is about and how it works.
In fact, this just makes me want to use `Example*` more than `Test*`
even more. Everything becomes a part of the documentation and you are
writing it from the perspective of someone using it.

##  Friday, July 24, 2020, 7:30:01PM

Looks like gotests exists already! I am so seriously blown away by this and it
has been around for a very long time. It even has test template support
with testify templates as an option.

```sh
go get -u github.com/cweill/gotests/...
gotests --all -w node.go
```

##  Friday, July 24, 2020, 7:24:21PM

It occurred to me that creating a tool to stub test cases would be
relatively easy just by walking the Go AST and matching the lowercase
file name with the specific interface or struct (assuming you use that
convention and they match).

##  Friday, July 24, 2020, 6:41:57PM

Realized there is real value in having *two* scripts directories in your
path, one for the good stuff in dotfiles that you can show off publicly
to others and another for all the quick and dirty shit that needs to be
made into something better eventually. Here's what they look like in
mine:

```sh
export PATH=\
$SCRIPTS:\
$SCRIPT_PRIV:\
...
```

(Of course there is a lot more in that path after that.)

##  Friday, July 24, 2020, 6:27:50PM

I really need to find or make a tool that creates test case boilerplate
code for everything in a matching _test.go file.

After making a note command for like the 10th time I am realizing how
much I really need to fucking finish kn. I would use it every five
minutes. It *must* be made!

##  Thursday, July 23, 2020, 4:58:59PM

I'm having a *really* hard time with the overwhelming level of laziness
in the mainstream population --- particularly among the highly gifted
who could be doing *so* much more for the world. Don't know why it's
hitting me so hard today. People constantly complain about not having
enough time for this or that when in fact they have *gobs* of time that
they are *choosing* to waste on absolutely stupid shit. At least I will
be able to eventually die with a clean conscience that I worked my
fucking ass off to make the world --- and everyone's lives in it ---
just a little bit better.

##  Tuesday, July 21, 2020, 3:47:47PM

TIL that Golang uses reflection to implement the "built in"
unmarshalling. That means no one should ever use it if they actually
care about the performance of their JSON parsing. Instead, I will
*always* implement UnmarshalJSON myself to ensure stuff gets where it
needs to be as fast as possible *without reflection*. Seriously,
reflection is pretty damn close to the devil.

##  Monday, July 20, 2020, 8:37:16AM

As I wrap up the [PEGn.pegn
spec](https://gitlab.com/rwx.gg/pegn/spec/-/blob/master/PEGn.pegn) I
realized how much incredible value this has for helping people learn
different language syntaxes. The power comes from a drop-dead simple
representation of a language syntax that most people can understand
simply by looking at it with no further explanation. By having people
learn language syntax by studying a PEGn grammar they can immediately
apply their own internal syntax checkers when writing one.

This doesn't even touch how easy it is to excite someone into writing
their own language once they see how easy PEG grammars are to write. The
processing of logic in PEG makes for good logical ("computational")
thinking as well.

One of my new goals for the community is to have people write PEGn
grammars for all the major languages. Having a database of PEGn files
for all languages is *such* a huge win in terms of understanding
different languages and being able to create your own.

##  Sunday, July 19, 2020, 6:17:42PM

Getting the questions about why I'm not writing a book. I get that one a
lot. The simple answer is that books take too much time and knowledge
bases are more effective *if I can ever get them updated enough*. The
challenge is getting your knowledge into others' view. But the plan for
Knowledge Net sharing and subs will cover that. I just have to fucking
finish it!

##  Sunday, July 19, 2020, 9:34:11AM

PEGn is *really* grabbing my attention. It's becoming that perfect thing
between ABNF, the original PEG (which has several substantial flaws in
the "example" syntax), and the myriad PEG parsing engines out there ---
all of which suck at creating *readable* grammars. PEGn will boast the
following when complete:

* Self-specifying PEGn grammar
* Most readable grammar specs on the planet
* Nearly identical semantics to original PEG "example"
* Semantic capitalization identifier naming conventions
* Full set of reserved classes and tokens
* Zero ambiguity semantics
* Full Unicode support

And eventually I plan on building the following tools for it as well:

* pegn - linter, validator, and code generator
* vim-pegn - vim plugin with PLS language server support

My code generator won't clutter up the grammar itself with inline code
(as cool as that is). Instead it will allow *granular* creation of the
different language renderings. This is substantially better than
*anything* else out there right now because they are all ***language
specific*** which destroys the usefulness and ubiquity of the grammar
itself. No, instead pegn will support *modular* code generation,
allowing different implementations of a rendered parser even in
the same language. For example, say you want your grammar generated in
interface-centric Go versus struct-centric. Or say you want to generate
code that generates an AST, or other code that is focused entirely on
handling parse events. There are so many different ways to implement a
parser for different needs. The one flaw that *every* PEG-to-code
generator has right now is the inability to adapt to these needs and the
fucking gawd-awful grammar specification files that result.

##  Sunday, July 19, 2020, 7:39:00AM

Been really conflicted about when to use Go interfaces and when to use
structs. I tend to be a one thing or the other kind of guy. Using
interface gets you immense flexibility while structs work better
with marshaling and require *far* less code. I've decided to follow
Goldmark's lead and create both my parsers leaning on interfaces more
even if that means a few accessors and mutators. I am probably too
abused by Java to look at them rationally. They probably do have good
use sometimes.

##  Saturday, July 18, 2020, 1:43:31PM

Got tinout moved over to <https://gitlab.com/rwx.gg/tinout> and
push-mirroring to GitHub. I've decided *nothing* goes into rwx.gg that
isn't *at least* version 1.0 or higher. I want to have someplace where
people can go and be reasonably sure that stuff will be usable.

##  Saturday, July 18, 2020, 11:56:35AM

Having writer's remorse over writing that slam on tags and structs. As
usual the truth is in between them. In fact, I *love*
github.com/ghodss/yaml (and so does the Docker project) for parsing
YAML into structs with the least amount of hassle --- when structs make
sense.

I've been really second guessing my decision to move to interfaces for
all the knowledge package stuff. After all these things *are* just
static data. I'll move to struct approach for the AST from Ezmark before
I make a final decision on the kn stuff.

##  Saturday, July 18, 2020, 11:15:32AM

After facing the quirks of JSON and YAML tagging yet again I went ahead
and wrote [Golang YAML/JSON Tags Actually Suck](/golang-tags-suck/).

##  Friday, July 17, 2020, 6:51:31PM

Cloudflare just went down reportedly because of a "bad router rule" in a
server in Atlanta taking out 1.1.1.1. The number of people depending on
that central DNS provider is proof of how stupid people are. The entire
point of DNS was to allow *distributed* DNS providers rather than have
everyone depend on a service. It really revealed how stupid some
companies are. GitLab was one of them. After seriously fighting with
GitLab's brain-dead flavored Markdown --- despite their claim to be
moving to CommonMark --- I've been gathering up reasons to give GitHub
another look. But, honestly, I'm kinda tired of depending on *any*
centralized service at all at this point.

##  Friday, July 17, 2020, 8:13:13AM

Another amazing and unexpected advantage of writing in PEG is that you
can specify ordered priority such that things that are *more likely* to
occur in a language are examined first. This has never been something
any specification language has allowed an author to communicate. It also
brings forward some of the difficulties when a syntax would be easier to
parse *without* the preferred position when examining the input. For
example, Text is far more frequent than Tex. But checking for Tex
lexically is easier because you look for `$` and know you have it right
away, instead of maintaining the priority and checking that `$` is *not*
present so that you can continue with the Text parsing. This does
cause a bit of redundancy in the parsing engine because to check for
Text I have to rule out `$` and then later have to check for `$` to
make sure I have a Tex inline. The cost is easily worth it, though,
given all of the code that would have to be evaluated otherwise, leaving
Text as the plain option at the end of the list.
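Here's a toy Go sketch of that trade-off (names, the single-character delimiter handling, and the return shape are illustrative assumptions, not the real PEGN engine): Tex gets the cheap positive check for `$`, while Text must carry the matching negative check.

```go
package main

import "fmt"

// parseInline sketches the ordered-choice redundancy: Tex checks
// positively for '$', and Text repeats the same check negatively by
// consuming only until a '$' would begin a Tex inline.
func parseInline(in string) (kind, val string) {
	// Tex: the easy lexical check -- leading '$' with a closing '$'.
	if len(in) > 1 && in[0] == '$' {
		for i := 1; i < len(in); i++ {
			if in[i] == '$' {
				return "Tex", in[1:i]
			}
		}
	}
	// Text: the redundant negative check -- stop where '$' would
	// begin a Tex inline.
	i := 0
	for i < len(in) && in[i] != '$' {
		i++
	}
	return "Text", in[:i]
}

func main() {
	fmt.Println(parseInline("$x+y$ rest")) // prints "Tex x+y"
	fmt.Println(parseInline("plain"))      // prints "Text plain"
}
```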

##  Friday, July 17, 2020, 7:52:01AM

I cannot overstate how amazing PEG positive and negative lookahead and
lookbehind are for specifying language grammars. It allows
specifications to directly communicate the code that needs to be written
including some idea of how much memory will be needed for any lookahead
specified by the grammar as well as how many previous states will need
to be saved (memoized) to assert any look behind.

This has been particularly useful when dealing with sets that can
include other sets except for one specific thing. This is impossible to
capture in EBNF or ABNF and requires resorting to "rhetorical"
specification syntax.

Here's an example: Markdown *inlines*. Often one inline can contain most
of the others. In PEG you simply negate the inlines that another
inline cannot contain (rather than explicitly rewriting every one of
them).

```peg
Inline <- Text / Quote / Emph / Link / Pre
Quote  <- (!Quote Inline)+
```

##  Thursday, July 16, 2020, 7:53:08PM

Yet another reason not to use Zsh(it). It doesn't even have [variable
name references](https://rwx.gg/advice/dont/zsh/). Zsh is such a script
kiddy toy, just so much evidence of that now. I'm beyond trying to
listen to people convincing me otherwise. "Run along. I have work to do."

##  Thursday, July 16, 2020, 4:44:12PM

Finally got at least all the main pages on [rwx.gg](https://rwx.gg)
working again and put the old kn shell script back in play. It is so
great for auditing.

##  Thursday, July 16, 2020, 8:19:17AM

Playing around with a new morning routine. Been up since 6am today,
yesterday 5:30. I have been naturally waking up earlier as I just write
off the end of the day and go to bed around 10:30. They say you need
less sleep as you age, but I don't buy into that idea --- especially if
you are still regularly exercising. I've been running for an hour or
more every day now for over a month. It's been absolute bliss. I love
yoga, but running on a good trail has always centered me mentally as
much or more. I'm still planning on daily strenuous yoga asana again
after I get my base health back.

Here's my daily schedule lately:

|Hour|Activity|
|-|-|
|6:30a|Up / Resting Heart Rate / Coffee & Walnuts / Code / Crap|
|7|*Eat* (Oatmeal, Protein, Coconut Oil, Coffee)|
|8|Code / Write / Think|
|9|Run|
|10|*Eat* (Protein and Avocado Toast or Pickle, Tomato Sandwich)|
|10:30|Stream / Teach|
|11|Stream / Teach|
|12p|Stream / Teach|
|1|*Eat* / Relax / Coffee|
|2|Code / Write (Live)|
|3|Code / Write (Live)|
|4|*Eat* / Mentor|
|5|Mentor|
|6|Mentor|
|7|*Eat* / Mentor|
|8|Mentor|
|9|Walk Dog|
|10|*Eat* / Relax|
|11:00|Sleep|

My best brain power of the day is --- without a doubt --- in the
morning. It is also when it is the most peaceful around here.

I'm going to do a better job writing this personal stuff down in case it
might help other people heading into old age watching their bodies freak
out in ways they could not have anticipated. Mine happens to be chronic
inflammation for reasons I cannot explain. Here's how I've started to
beat it:

1. Slow-paced running about an hour a day away from people
1. Completely eliminating any refined sugar or processed food
1. Dropping meat and carb-dense food from diet
1. Eating rather small portions of things more often
1. Setting an alarm to eat every three hours or so
1. Focusing on positive things instead of stressful stuff
1. Wearing a mask even in the house during pollen season
1. Taking my Xyzol to keep from reacting to our dog

I don't have Diabetes but my family has a long history of type I and II
so I figure treating myself as if I could have it eventually is just
safe. So far my blood sugar and insulin response seem like I could be
subject to it a bit.

I just read on a site run by the Diabetes association that one way to
treat it is to essentially track your food (and your blood glucose,
which I'm not going to do yet at this point) and eat about 1800 calories
tops (for an average size person) with no high-carbs. It's kind of like
the Keto diet without the huge negative side-effects of Ketosis,
replacing fruit with veggies, fluids, and fat --- glorious, good fats.

In fact, fat is really the secret to a lot of good health (for me). It
is so fucking ironic that one misunderstood study sent the entire world
into an obesity and Diabetes epidemic mostly because everyone eliminated
all fat from their diet.

Fat provides consistent energy and blood sugar without spiking. It
satisfies you so you eat less. Some fats are essential to building brain
cells.

One thing is for sure. Sugar is the *fucking* devil. It feeds Cancer. It
spikes insulin and destroys the Pancreas. It rots your teeth.
Statistically speaking Sugar is more deadly than Cocaine and yet they
are basically the same thing, addictive isolates taken from natural
sources.

##  Thursday, July 16, 2020, 7:46:42AM

I've been cleaning up the sites with the old Bash kn script now that
I'm taking all this luxurious time to actually finish Ezmark. It is
always good to use the prototype again to get a sense of what I was
trying to do in the first place. Keeps me grounded.

One thing that looks ludicrous to me now is adding so much data to the
YAML metadata header. Back then I was convinced it was easier to use the
YAML since it is more structured. But the truth is the YAML should
*always* be about the *meta*. Content specifications can call for
certain header names and structure to the Markdown (whatever flavor).
Anything else probably deserves its own file that can be rendered
inline, which is what the RenderMark approach is all about.

##  Wednesday, July 15, 2020, 4:49:13PM

Great ideas from the stream today about rendering the TexExpr block
and Tex inline as SVGs that are inlined into the HTML rather than
depending on a JavaScript library *at all!*

##  Wednesday, July 15, 2020, 4:27:32PM

Big great discussion about MathJax or KaTeX and what to call the AST
element. Pandoc mistakenly called it Math.

```markdown
Here is a $\epsilon$ thing.

$$\forall x \in X, \quad \exists y \leq \epsilon$$
```

So that turns into this:

Here is a $\epsilon$ thing.

$$\forall x \in X, \quad \exists y \leq \epsilon$$

##  Wednesday, July 15, 2020, 3:56:02PM

Was reminded that a `{{.TOC}}` in the template is a good idea and that
TOC content is metadata, *not* data. It really has no place in the
README.md file itself, where it is just bothersome to the content; the
TOC heading data belongs in the BASE/json file.

Also decided that Heading attributes really need to be mandatory to
allow Heading text to be changed without impact.

##  Wednesday, July 15, 2020, 8:32:37AM

I love that Go's creators were so fucking experienced that they could
leave goto in Go without shame. It seems like the entire world of
less-than programmers don't get why they made this decision. But if you
truly want to understand a specific case where it makes a *ton* of
difference in efficiency take a look at Go's own syntax parser. Yep,
there's goto in all its glory doing what it was *meant* to do in
spectacular fashion. These are yet more reasons to truly understand why
Go is a *far more thoughtfully designed language than Rust*. Very few
people on planet Earth would even understand an explanation of why that
is objectively true. But it is nice to know they do exist. I am such a
Rob Pike fanboy. There I said it.
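For a feel of the pattern, here is my own toy sketch (not code from the Go compiler): labeled states with goto instead of nested loops and flag variables, in the spirit of what the scanner does.

```go
package main

import "fmt"

// scanNumber returns the run of digits at the start of b after any
// leading spaces. Each label is a scanner state; goto re-dispatches
// without extra function calls or flags.
func scanNumber(b []byte) string {
	i := 0
skipSpace:
	if i < len(b) && b[i] == ' ' {
		i++
		goto skipSpace
	}
	start := i
digits:
	if i < len(b) && b[i] >= '0' && b[i] <= '9' {
		i++
		goto digits
	}
	return string(b[start:i])
}

func main() {
	fmt.Println(scanNumber([]byte("   123abc"))) // prints "123"
}
```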

##  Tuesday, July 14, 2020, 3:02:22PM

While doing the PEG for KN Ezmark I realized that BlockQuote and Div
are effectively the same thing in the Pandoc AST. Both are containers,
but the Div is *far* superior to maintain and parse. In fact,
BlockQuotes have always been a pain in my ass. I've decided to try and
get away with dropping them entirely from Ezmark. I am sure some people
will scream and they can use other full parsers like Pandoc if they need
them.

This does mean that Div is actually a SemDiv because it is *not*
based on style. It denotes a semantic collection of content within the
current content such as a callout, note, or even an *actual* block
quote. People can use them for addresses and such as well. In fact, it
is the *exact same thing* as a Fenced syntax aware code block, but for
other stuff.

##  Monday, July 13, 2020, 2:44:42PM

The Pandoc AST isn't bad, it's just not what I would do. It's too much,
doesn't match CommonMark, and it's just so damn annoying sometimes. I
mean, `Inline.Smallcaps`? Really?

The biggest problem of all is how much of it you cannot change.

Goldmark grew its own AST as well. It's not horrible, just not very
well informed by years of experience looking at document structure and
obsessing over this stuff, as I have been.

None of the document and knowledge solutions have ever *started* with
the AST model and worked up. The closest thing we have to that was the
DOM and we know how that ended up.

Back when I was doing the ABNF for BaseML (later EzMark) one thing
really stood out. None of the existing Markdown formats --- from the
very beginning --- were able to be rendered inline. They all required a
first pass in order to parse *all* the block types and reattach the
reference links and such. That has *always* annoyed me. Each block
should be consumable and rendered immediately after it is read.

I really am having a hard time just shutting up and using Pandoc. There
is so much bloat and overkill to use that method. Pandoc Markdown
doesn't even look CommonMark compliant.

What I really want to do is specify a superset of CommonMark that is
100% Pandoc compliant that can be rendered immediately and mastered very
quickly. I want RenderMark.

##  Monday, July 13, 2020, 5:21:09AM

Up early after a long sleep. Must have been that three hour run/hike on
Saturday. Feels good though. I can tell my old body is returning.

##  Sunday, July 12, 2020, 5:01:04PM

Nothing like getting something great down on "paper" to motivate me to
do everything else to make it a reality.

I love where the design of Knowledge Nodes is going --- particularly the
dynamic Node types that will ultimately display word maps by usage,
dependency trees, 2nd and 3rd level subscription recommendations, and
even some machine learning common usage patterns.

The idea that I will be able to *visually* display my entire RWX
Knowledge Network out to two or three levels is so absolutely awesome
that I want to make it *today*!

It is so hard to describe to people (including my wife) but when they
*finally* see this in action --- especially after we get a few hundred
people on the RWX Knowledge Net --- no one will even really want to use
Google again. Why use Google when I can search *all* the content from
everyone in my *customized knowledge network* instantly and locally?

You won't even need a fucking web browser! Graphical knowledge net
browsers can run *entirely* on SSH connections and fallback to HTTPS
because they don't even need to access the Internet that much, only to
check in. *This* was the vision of RSS that never really came to pass.

Seriously, this has very real potential to change the world in ways I
have always wanted. I hope Aaron would be proud. May this whole effort
succeed, for him.

##  Sunday, July 12, 2020, 8:58:51AM

What was I thinking? Just use pandoc you idiot! (That's me shouting at
me.) Even if I did manage to create a PEG Golang parser (which I really
*want* to do because it would be fun) it makes no practical sense when
an entire project of brilliant people are supporting Pandoc *every day*.

So what if pandoc is a subprocess every time I call it? That's nothing
these days. Will its speed rival Hugo? No, but it won't be as brain-dead
as Hugo either. Hugo was made by engineers, not academic and scientific
writers. That is why Pandoc has such a strong LaTeX influence and why R
made Pandoc its official documentation language.

If and when I hit a performance issue there are plenty of ways to
address it:

* Create a daemon that automatically builds anything when changed. I
need this anyway to preview the rendering in pseudo-real-time.
* Cache AST parsing and word crunching so that it just has to be
recombined in a `make`-like way.

Plus if I make this all about Pandoc I might even get some buy-in from
the Pandoc community itself, since Knowledge Net will be fundamentally
dependent on it. But hey, the WorldWideWeb was created by academics to
share information so why not?

##  Saturday, July 11, 2020, 8:38:29AM

Woke with an idea to build the Pandoc AST into the core RWX Knowledge
Net specification. In fact, I can use Pandoc in two stages to prepare
for its eventual replacement as a parser and renderer, first to generate
the JSON AST (loaded into a Go struct) and then to render by piping the
JSON AST into the pandoc app for rendering. It will make the initial
tool need double the calls to pandoc, which I was trying to reduce,
but for a *very* good reason. Later I can keep pandoc as a renderer
(since people will always want a fully configurable Pandoc rendering
option) but I will replace pandoc as a parser with a native Go PEG
parser.

##  Friday, July 10, 2020, 9:23:42PM

The more I dig deep into the Goldmark Golang Markdown library the more I
realize how bad it is, mostly because I have seen a very good AST
already (Pandoc). The author of Goldmark probably doesn't even know
Pandoc exists. So much for my Goldmark fascination. I sometimes hate
that I can actually look under the hood and see the shit for what it is
a lot of the time. Just the crappy autoidentifier code was enough to
throw it out for something that is actually specified (Pandoc). Every
quirk that Pandoc has is nothing compared to the hack that Goldmark is.
Problem is, all the rest are even worse --- especially Blackfriday.

##  Friday, July 10, 2020, 12:06:31PM

That vim-pandoc plugin gets so many things wrong. I'm seriously
annoyed even though I should be grateful it exists in the first place. I
definitely need to make my own vim-datadoc plugin from scratch but
built on the ideas in vim-pandoc.

The official name for the syntax of all RWX Knowledge Nodes will be RWX
DataDoc Markdown. The emphasis is equally on the YAML data, not the
limited Pandoc Markdown of the document which is good since the YAML
data is *never* optional in a DataDoc.

Again, I have a lot of editing to do.

##  Friday, July 10, 2020, 11:53:26AM

This running has resulted in an explosion of brain power. I forgot how
much a slow recovery walk after a good 20-30 minute run just hyper-charges
my brain with oxygen and great ideas --- even a new personal mantra
that's fun to chant to myself while running.

##  Friday, July 10, 2020, 9:49:45AM

I'm really struggling with a good alternative to Pandoc divs (:::). I
have really come to love them in my writing but the more I think about
them the more I realize pretty much everything I ever put into one
deserves its own knowledge node instead and a link. I have this problem
because I'm such a parenthetical writer. I am always by-the-way-ing this
or that instead of just focusing on the immediate topic. At least with a
link people can follow the tangents if they really want.

I'm pretty sure serious publications handle this with the square bracket
syntax that refers to other articles.

\[Here's yet another story about this.\]

I like that approach the best I think. It is well-established and looks
pretty in my Vim editor because the brackets still force the text to be
highlighted even though they are escaped out with backslash.

Plus when parsing, it's all I have to look for.

Looks like I have a *lot* of rewriting to do. The good news is that I
will have a locked down writing style standard that I and anyone else
can follow if they want.

##  Friday, July 10, 2020, 9:39:28AM

Sometimes flexibility is the fucking devil. For example, allowing the
RWX Knowledge Net (yeah, that's the latest name I've been using) to be
anything *but* YAML and limited Pandoc Markdown is a huge disadvantage.

We can really see this with Hugo. It supports no less than three
"front-matter" (God I hate that term) structured data formats. What a
fucking nightmare! I am so glad to be free of that shitty design.

If there is one thing the last three decades of Web technology have
taught us it is that flexibility kills adoption and inhibits sharing.
This is why we still have YAML 1.2 spec from like more than *10 years
ago*. Smart designers do *not* want flexibility when it comes to things
like structured data formats. They want consistency.

So the RWX KN spec *only* allows a single YAML header and *optional*
body of limited Pandoc Markdown. Both are very solid standards that have
been in use for years.

##  Friday, July 10, 2020, 8:53:58AM

Back to masturbatory note taking. The idea that I could whip up a blog
for every one of these random, spontaneous thoughts was just stupid.
Notes are the organic stuff that bubbles up and needs to be caught right
away before it fades. Then an article (or *blog*) can be made from that.

By the way, I fucking hate the word *blog*. It has lost all specificity.
Usually when people say "blog" they mean "piece" or "article" or
"chapter." The term is dead to me. I will silently look at others who
use it and quietly mock them inside. It's my prerogative.

No, instead I hereby assert the following moniker for this random brain
dumping. From now and henceforth it shall be called --- wait for it ---
*note taking*. I know. The level of my brilliance cannot be grasped.
\*tongue firmly in cheek\*

##  Thursday, June 18, 2020, 6:46:47PM

I seriously can't get anything done because I keep having better ideas
before implementing the previous idea. It is getting really annoying.

The latest comes from really looking into the different forms of
knowledge node and realizing that if I can have an entire node just for
a concept or term definition I certainly can have one for a blog post.
This addresses another concern I was starting to have for letting users
order their blog entries the way they want when reading them as well as
being able to link to specific blog entries without having to load up
the whole week's worth of stuff.

I've also been changing recently to putting some sort of title on the
blog to give a summary of it, which means I might as well title the
things.

The real question is what to make the URLs. Some say that the text in
the URL gives it a better score. I don't really care. I just want a
short, time-based URL. Hummm ...

```
2020/wk24/some
```

Yeah that's the ticket.

##  Thursday, June 18, 2020, 5:35:45PM

My entire life I have struggled with seeing dumb shit and not having an
appropriate outlet for it. I thought the 'Shit' page here could help
with that. Then I experienced *another* racist prick who uses that
approach specifically to appeal to a *specific* type of person and I'm
like, "Oh shit." Yeah. It became clear that my venting could easily be
taken up by that same type of person.

Words are powerful and no matter how raw and frustrated I get from being
engaged with the world I *cannot* let it make me behave in ways that I
so deeply loathe in others. I am just a hypocrite at that point.

I'm not saying I'm going the other way and becoming a snowflake afraid
to disagree with anyone for fear of offending them simply by
contradicting them. A good *debate* is needed, not a personal,
shit-flinging fight.

##  Thursday, June 18, 2020, 4:55:40PM

Reminded today that some people are just such assholes and sometimes I'm
gonna get asked about them. Some YouTubers are obviously misogynistic,
racist, bigoted pricks who make fun of autistic people. After a bit of
personal frustration with the existence of these people and some support
from my wife and community I've recommitted to not get mad, just get
busy. I have to drown out these dumb-asses not just because of their
tech preferences but their influence on particularly young people.

Lucky I have one big advantage. I can form relatively persuasive
sentences that form paragraphs. Most of these assholes spend all their
time making lame videos and pumping up their echo chamber forums rather
than creating anything of substance. But then again, the world largely
doesn't *want* to read any more. Idiocracy is real.

##  Thursday, June 18, 2020, 9:12:07AM

The more I toy with Rust the more I realize that a language's strengths
are often dissociated from its syntax --- but not always.

The stupid implementation of generics in Go to save a few compilation
milliseconds at the cost of syntax clarity *forever* was *not* worth it.
Just shows where the *current* Go team's priorities are.

The Rust syntax is dense, but not difficult once you understand what is
going on. It reminds me of Perl so much it's uncanny. I chuckle a bit
that so many people are Rust lovers while at the same time hating on
Perl for its "read-only" complicated syntax. Rust syntax is *far more
dense* than Perl's for most applications.

Still I don't mind. God knows complicated syntaxes have never thwarted
me in the past. I have to say that the amount of *raw*, on-the-metal
power Rust provides with the promise and hope of a world full of *safe*
code is overwhelming, but I've been burned by languages making lofty
promises before. \*cough, Java\*

##  Wednesday, June 17, 2020, 5:17:41PM

The CTF games <https://overthewire.org> and <https://picoctf.com> have
reminded me overwhelmingly that the secret to having fun learning is
gamification. I've always done it, but I sometimes move away from it a
bit as I get all teacher-y and start explaining stuff.

Well I'm here to report that returning to challenge-based learning is a
*massive* success. Every private mentored session starts with me giving
a challenge in terms of the specific outcome and not telling the person
*anything* about how to accomplish the challenge, only the key terms and
words they need to understand and research on their own to do it. The
reaction has been *overwhelming* --- so much that I really need to add a
point system to the whole thing eventually so people can have me check
them out and give them the points somehow, if not just encourage them to
keep track of them all somehow.

There are *so* many things that need to go into this new knowledge app.
I have to keep at it.

##  Wednesday, June 17, 2020, 4:16:32PM

After doing a few small challenge projects in Rust I'm finding how much
I *really* love Rust, not for the syntax, but for the absolute freedom
and seriously low-level of the language. It is *far* easier to integrate
C and C++ code with Rust than with Go, and safer to do so. There are a
number of other things bumping around in my head that are really
annoying me more than I care to admit about Go:
more than I care to admit about Go:

* **Go is unsafe by default.** You can relatively easily write broken,
unsafe concurrent Go code and likely won't even realize you are doing
so.

* **Go contexts are horrible.** I hate them. They are poorly conceived
and implemented. They feel so hack-ish to me.

* **Generics are being crammed in.** People are arguing about the
generic(some) syntax vs generic<some> and I have to admit the
contrarian position of avoiding the syntax for generics that has existed
forever in other languages is just plain stupid and annoying.

* **Go has OOP inheritance stink on it.** I actually read something
about the "diamond patterns" that can happen with "embedding" (code for
inheriting) other structures. It shouldn't surprise me since there was a
Java guy on the original design team for Go (who recently left). Rust
has more functional influence and smells a lot more like Haskell, which
is *so* much better.

* **Pike and the original creators seem no longer engaged.** They are
rarely involved in the mailing list and rarely have any direct input to
the Go project. It really feels like they are letting it languish and
the design decisions are much less informed of late. People on the list
reported problems with 1.14 but it got released anyway.

* **Bryan Cantrill likes Rust, not Go.** I know this is a small thing
but I *really* respect his level of knowledge about languages and if he
has a problem with Go then I really have a problem as well.

* **Rust is the industry's ["best
chance"](https://thenewstack.io/microsoft-rust-is-the-industrys-best-chance-at-safe-systems-programming/)
for safe programming.** That's a pretty big statement from Microsoft of
all companies. If there is one thing the world needs, it is *less*
insecure code. That is almost entirely reason enough to get behind Rust
*no matter what*. The rise in cybersecurity needs will only get worse
and Rust is the safest systems development language on the planet, bar
none ('cept maybe Ada, someone said).

##  Wednesday, June 17, 2020, 9:13:35AM

I've turned on syntax highlighting by default in kn because the blank
line fix addresses it. This will result in substantially larger
knowledge base renders than with it turned off but will allow people to
provide their own loadable syntax highlighting. It is just a matter of
time before the default knowledge base template contains the JavaScript
to tap and pick from a number of different syntax highlight options that
the *reader* can set based on their preference. I cannot wait, actually,
that is something I have been wanting to do ever since I set all the
standard colors of the rendered Web version as CSS variables that can
very easily be changed and stored in the browser or even downloaded as a
JSON file.

As much as I want to do other things eventually, this work on kn and
the README.world is *so* important that it *must* take priority. When
fully realized the world will have an easy way to capture, maintain,
share, and consume knowledge according to the preferences of the
*reader*. Taken to the n-th degree this could become the foundation for
all knowledge, even proprietary knowledge that needs to be sold. I have
not encountered any other solution to this problem that incorporates so
much of what is already best practices.

By the way, I cannot wait to integrate OpenPGP signatures as well. Every
time I see a star by someone's name in Twitter indicating that Twitter
has graciously decided to confirm for the world that a person is who
they say they are I feel a little kick in the butt to get signing
working. This is a problem that already has a solution --- that no one
actually uses because they *think* it is too hard.

##  Wednesday, June 17, 2020, 8:30:42AM

I'm super annoyed that Firefox decided to actually make whitespace a
node that screws up pre-formatted blocks. There is simply no way around
it without a horrible JavaScript fix that does not work for text-based
browsing.

I did find a fix for Markdown documents that I hate but will have to
start following. Always put the fence posts in a code fence on their own
lines and deal with the extra blank lines always at the beginning and
ending, which is easy enough through some style changes.

This goes for Pandoc div fences as well:

~~~markdown

Do this:

```js

console.log('hello')

```

And this:

::: note

No seriously.

:::

~~~

That makes it more readable and searchable anyway so I’m fine with that convention — especially since it gets entirely out of the way when doing optional syntax highlighting.

## Wednesday, June 17, 2020, 7:59:27AM

Even though Linus came out today and said 80 characters is too limiting for line lengths, and even though I have been pushing for one-big-long-line paragraphs in Markdown to play better with panes that might be less than 80 characters, I have adopted a 72-character maximum line width convention for my own YAML knowledge base data, including the Summary and paragraph forms of data.

Here’s the thing. Once you sign off on putting your data into YAML instead of Markdown you are accepting that white space matters — particularly initial white space since it is a fundamental of YAML syntax, structure and readability. Such is not the case with Markdown. Having a super long text value wrap and break up the clean whitespace lines to the left is simply so ugly and hard to process that I couldn’t take it.

Luckily Vim and other editors will automatically do this whitespace indentation for you. And since Python is still so popular with people in the world they will be fine with it.

This works particularly well for a new knowledge node format I’ve called ParaList which has a distinct topical sentence T and a paragraph body P. I can see if the topical sentence is getting too long just by seeing if it wraps.

~~~yaml
Title: Don't Be a Lazy Learner
Subtitle: Identifying an Anti-Autodidact
Quote: This is all just too much talking.
Query: true

Type: ParaList

Summary: Sometimes understanding how to become one thing means first
  understanding what that thing is not. Here's a list of characteristics
  and behaviors you will find in an anti-autodidact, a lazy learner who
  would rather make excuses than do the real work to learn on their own.

ParaList:

- T: They are bored and confused by words in general.
  P: The more words, the more they check out, spoken or written. Most
    have a ridiculously low vocabulary and don't ever let on how many of
    the words they don't understand in any given conversation.

- T: They don't have the capacity or motivation to figure things out.
  P: They say shit like, "I don't get it" or "It's not working" or "I
    did what you said" or "The stupid computer won't..." or "What is
    happening?" They ask a lot of questions during movies.

- T: They want to be in the same space with people who do learn.
  P: They are uncomfortable working from home. They wander into your
    cubicle at work a lot. They feel safer in your presence because they
    don't feel safe on their own. They feel they need you close to
    learn, even on their own.
~~~


Typically such format constructs will have the topical sentence be first
and all bold while still remaining a part of the paragraph.

##  Wednesday, June 17, 2020, 7:29:45AM

YAML is the API of knowledge. I don't think I fully recognized the
genius that is behind the *full* and much derided version of YAML until
I started thinking of *all* my knowledge in terms that would fit into
one of a few self-made knowledge organization types. Turns out there are
only a few constructs we use to capture our knowledge and all of them
can be easily categorized as knowledge format types. I've been building
out the list made a few blog posts back and as I've been converting my
[RWX.GG]{.spy} knowledge base over I've been *loving* it in ways I'm
quite sure very few others will appreciate. Doesn't matter. I know Aaron
Swartz would approve. His is the only opinion I would work to win over.

One dilemma I am having, however, is how much to dismiss the idea that
all knowledge should be able to be written by *anyone*. I still maintain
that Pandoc Basic Markdown is the world's most universal and simple
format for knowledge without losing sustainability. But putting more of
my knowledge into the YAML section instead of the Markdown section could
be seen as a move *away* from easy-to-attain technologies.

Basic YAML is *so* easy to learn and maintain and read that I feel like
this move is consistent with my primary goal of knowledge format
simplicity. If anything it might help people categorize their knowledge
as they capture it by pre-meditatively working from a few common
organizational formats that don't have anything to do with a heavy
syntax such as HTML --- or worse --- XML or the semantic web
specification.

One thing is for sure. Even if I do move my knowledge formats into a
form that requires learning YAML as well as Markdown and it puts it out
of reach from someone who just knows basic Markdown I feel I need to do
it anyway for much the same reason that a Mathematician would use LaTeX
within Pandoc to describe their knowledge. Some knowledge must be more
structured than simply a whole bunch of Markdown.

##  Wednesday, June 17, 2020, 6:54:41AM

Yesterday I had a lazy learner during the live session say, "This is
just all talk". The topic was "Get Linux" which requires an *enormous*
amount of discussion to get right. I made a big deal explaining how
important it is to assess the needs of the person to whom you are
recommending a Linux distro before just blurting out, "Manjaro is the
best!"

People who make recommendations without considering the needs of the
people they are recommending to are irresponsible egomaniacs motivated
by something other than the welfare of those they claim to be helping.

Lazy learners have a few things in common:

* They are bored and confused by words in general. The more words, the
more they check out, spoken or written. Most have a ridiculously low
vocabulary and don't ever let on how many of the words they don't
understand in any given conversation.

* They say shit like, "I don't get it" or "It's not working" or "I did
what you said" or "The stupid computer won't..." because, in fact, they
don't have the capacity or motivation to even try a little to figure out
what is happening.

* They generally want to be in the same space with you. They are
uncomfortable working from home. They wander into your cubical at work a
lot. They feel safer in your presence because they don't feel safe on
their own.
They feel they need you close to learn, even on their own.

* They love mythology and Marvel movies. They don't have to think. They
don't even have to imagine. There are fewer words to confuse them. They
just plugin and go for a ride. This is also why they hate reading,
role-play, and sandbox video games.

* They are prone to religion and tutorials, which in many ways are the
same thing, a recipe for how to do something step-by-step without having
to think about *why* you are doing something. Just do what someone says
and
you'll have your app or your key to Heaven. It's the same appeal.
Working out how to make your own app or discovering modern ethical
behavior are hard, too hard for a lazy learner. They want to be told
what to do.

* They ask lazy questions like, "Which distro do you use?" or "Which
window manager is best?" or "What certificate do I need?" or "What
college should I go to?" Again, they would rather have a recipe than
analyze anything at all.

* They tend to be cheaters because they don't value learning at all.
They are more focused on what they can *get* than what they can learn.
They would hate Kant's categorical imperative if they even knew what
that meant.

* They generally lack self-awareness and avoid introspection with
distraction and urgency addiction. When you ask them what they want to
do with their lives --- or even the current day --- they struggle to
answer. Asking them what they are good at usually results in long
pauses.

* They are either super depressed or overwhelmingly over confident.

##  Wednesday, June 17, 2020, 6:38:53AM

While helping people during private mentoring I'm becoming more aware of
just how difficult it is to get a handle on one's own knowledge
management. In fact, I'm now convinced it just might be the single
biggest thing blocking people from doing their own learning, from
becoming a *true* autodidact. Without being able to *write down* what it
is that you are learning and learn from your own written self
assessments that you can search for later you just cannot learn
effectively. In lay terms, you cannot learn without taking good notes
--- especially if your notes are also your main text book.

This is a bit frustrating because the number of people who simply don't
even know how to form a sentence --- let alone type it quickly and
coherently into a codebook --- is far too high. I think this was part of
the reason I was so frustrated later. I realized that my formula for
learning might simply not be as universal as I'd hoped because of this
*major* prerequisite. If you cannot read, write, and type you cannot
become an effective autodidact --- let alone create your own exercises
and execute your own activities and self-assessment. Learning to learn
is actually way harder than I have thought it to be, but like so many
things, because I surround myself with those who have already learned it
on their own I've had the impression that it's not difficult at all.
Having them around me has confirmed a false bias and conclusion that
learning is easy, that it is innate, that it does not have to be taught.
It does.

In short, I need to have a boost for each of the RWX elements, reading,
writing, and executing. I also probably have to have one for basic
computer algebra.

##  Tuesday, June 16, 2020, 5:58:37PM

I have had a lot of success moving to a more data model approach to the
knowledge bases. It feels a lot like what was intended behind the
semantic web but more focused on localized, even personal, knowledge
instead. If there is one thing I continue to obsess over it is
organizing, capturing, and sharing knowledge in a digestible, searchable
way.

The latest breakthrough was adding a Challenge category with the
Prereqs pointing to other HowTo type knowledge nodes. The Challenge
nodes are of type HowTo as well.
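A sketch of what such a node's YAML header might look like. Only Type,
Prereqs, and the Challenge/HowTo relationship come from the notes here;
every other key, ID, and title is hypothetical:

```yaml
# Hypothetical Challenge node header (Category, Title, and the
# prerequisite IDs are invented for illustration)
Type: HowTo
Category: Challenge
Title: Install and Boot Your Own Linux VM
Prereqs:                   # other HowTo nodes this challenge depends on
- get-linux
- learn-terminal-basics
```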

##  Tuesday, June 16, 2020, 5:35:46PM

Don't know if it is the dreary day outside but feeling a little
depressed. I have all the reasons in the world to be happy. Let's face
it. I'm a sun worshiper. I always knew it.

I think I'm battling feelings of being under-appreciated a bit as well.
I know that people value what I'm putting into the world --- for free
--- but I do sometimes wonder if they realize just how valuable the
information I am putting out there is. Yesterday I reviewed an
absolutely horrendous book that is $37 at Amazon. I fell for it. I
bought it. It absolutely sucks. But they have my money no less.
Meanwhile I continue to endeavour to put out high-quality, accurate,
modern content that is backed with objective experience and research
only to have to deal with the occasional downvote from a dolt who
doesn't even know what the word dolt means.

I am seriously saddened by the overwhelming level of stupidity rising
all over the world. All I can do is help fight it with every breath. But
it is the incessant attack on my optimism that eventually gets the best
of me and makes me just want to do the minimum to get by and languish in
escapism for the rest of it. So many people *live* in their escapism ---
especially now. I need more coffee.

##  Tuesday, June 16, 2020, 4:23:03PM

I read a dumb comment in a StackExchange post that said, "Why would you
ever learn shell? No one ever gets a job with that." I can't believe the
level of cluelessness of that statement. The thing that people think
isn't "going to get them a job" is actually the most magical way of
making them 10x better than anyone else qualified for that job ---
especially if you are going into anything in cybersecurity or systems
operations.

##  Monday, June 15, 2020, 9:43:09AM

Looks like [Google's shell scripting
guide](https://google.github.io/styleguide/shellguide.html) agrees with
my conclusions on pretty much everything. It doesn't just validate my
conclusions on Bash but everything I keep saying about all the shitty
advice out there --- particularly on YouTube.

One thing that convinced me to go back to two-space indent was Google's
guide. The waste of space has started grating on me --- especially now
that YAML is being affected. I would rather have more levels of modern
nesting than a default that wastes space.

Also discovered shellcheck after reading the Google guide. This isn't
the kind of utility that I would have exposure to in my previous
POSIX-only life.
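As a quick taste, here is a minimal sketch of a script in the Google
guide's declaration style (assuming Bash 4+; all the names here are
mine, not from the guide):

```shell
#!/usr/bin/env bash
# Globals with declare at the top, constants with readonly
# (readonly VERSION=... is the same as declare -r VERSION=...).
declare count=0
readonly VERSION='1.0.0'

bump() {
  # local accepts all the same options declare does, e.g. -i for integer
  local -i by="${1:-1}"
  count=$((count + by))
}

bump 2
bump
echo "v${VERSION} count=${count}"
```

Trying to reassign VERSION anywhere after this, even with declare inside
a function, triggers Bash's readonly-variable error. And shellcheck
gives instant feedback on scripts like this one.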
It is an absolute *must* for beginners and will definitely be the first
thing I introduce, along with bats for test-driven shell development.

I did discover that there is a readonly keyword which is the same as
declare -r and which is far more useful when declaring global constants.
I also ran into an interesting quirk that prevents the reuse of any
readonly variable name *anywhere* else in the code. It throws an error
even when using declare within a function scope. I confirmed again that
declare is *exactly* the same as local but learned that the local
keyword accepts *all* of the same options that declare does. Therefore I
will begin using Google's convention of local only for stuff within
functions, declare only for global stuff at the top, and readonly for
constants. While there is no technical requirement to do so, this
distinction makes for much more readable code.

##  Sunday, June 14, 2020, 1:21:44PM

Everyone knows how much I love Go. But something interesting has been
happening since fully adopting Bash concurrency and learning a bit of
Rust. Rust and Bash are squeezing out my personal need for Go. Let me
explain.

In the old days you had shell and C. That was it. C was even called a
"high-level" language by Kernighan at the time. Today I'm finding the
equivalent to be Bash and Rust. Go replaces Python. But the space
occupied by Python is also becoming smaller, at least for me. I'm not
doing a lot of cross-platform general-purpose and machine learning
programming. I'm not even doing a lot of terminal UI program development
or back-end web services development. These are Go's sweet spots.

Bash has *completely* taken over almost everything I would ever need
from Python and Perl (and I never did use Node, thank God). In fact, now
that I am more fluent with coproc and & for backgrounding processes
efficiently Bash has taken over a *large* part of what I would grab Go
for before.
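The coproc-plus-& pattern can be sketched in a few lines (assuming Bash
4+; the UPPER name is just an example):

```shell
#!/usr/bin/env bash
# Start a coprocess: write to it on fd ${UPPER[1]}, read on ${UPPER[0]}.
coproc UPPER { while read -r line; do echo "${line^^}"; done; }

echo 'hello' >&"${UPPER[1]}"    # send a line to the coprocess
read -r reply <&"${UPPER[0]}"   # read the uppercased line back
echo "$reply"

# Plain & backgrounding plus wait handles fire-and-forget parallelism.
sleep 0.2 & sleep 0.2 &
wait    # both sleeps run at once, so this waits ~0.2s, not 0.4s
```

That bidirectional pipe is the piece that used to send me reaching for a
goroutine and a channel.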
I also have been realizing just how bad Go is for beginners *when
specifically learning concurrency*. I used to think Go was great for
beginners because of its simplicity. But that is exactly why it is
dangerous. Go does not protect beginners in any way from writing unsafe
concurrent code. In fact, unsafe code is even *more* likely because
goroutines make it so easy for beginners to write concurrent code.
Beginners never learn the safeties built into the compiler to check for
race conditions and such. It is *never* covered in any material --- if
you can even find *any* up-to-date material at all. That's bad.

Meanwhile, despite the complexity of Rust for concurrency and the
general syntax that I've been railing on because it is *so* hard for
beginners to get their heads around, when a beginner finally does learn
concurrency in Rust they get safety automatically. They have to forcibly
break the safeties of concurrency already in place. In fact, Rust is a
much safer language all around to learn, *far* safer than Go even though
Rust syntax is *wildly* more complex than Go's.

Rust clearly replaces C and C++ for Unix-philosophy-compliant commands.
Rust is a *very* good candidate to rewrite all the old busted boomer GNU
code. Go isn't. In fact, it is a [wish](/wishes/) of mine that someone
would write a 100% compatible Bash shell clone under a permissive
license in Rust.

Then there's the what-would-I-require approach. What if I had to run a
company and my very life depended on the team I hire being able to
produce *safe*, robust code quickly that would remain completely
sustainable over time? Would I want to hire a bunch of Go programmers
who possibly came from the Node and Python community? Or would I want
the 10x developers who *understand* Rust and why it is so significant,
like Bryan Cantrill?
Picking Rust means rather than having to vet a potential Go programmer
by asking all kinds of questions about compiling with the profiler and
race condition checks I can simply ask to see an example of a
candidate's concurrent Rust code instead. It is *much* harder to
identify a Go programmer experienced with safe concurrent programming
than a Rust programmer of equal skill.

Clearly if my life depended on it I would want Rust developers over
*all* of the others even if I had to spend 10x the effort to find them.
If I made a bad hire I'd still be happy with the fact that it is nearly
impossible to write unsafe code in Rust. In other words, Rust covers me
on two fronts: (a) only really well-informed and naturally good
programmers even learn Rust, and (b) beginning Rust programmers are
*more* likely to produce good, strong code that will stand the test of
time. No wonder it seems like all the good Rust developer jobs are in
Germany.

Rust makes it easier to filter out those who just aren't safe
programmers. If my life and company depended on it, the *value
proposition* of the Rust language is much more compelling. It just
depends on whether Rust truly delivers on the promise of *safe*
concurrency. That's where I need to focus my research.

What I'm saying, I think, is that even though I've barely coded anything
of any significance in Rust I now see why the counter-intuitive notion
that Rust is better for beginners *despite* its complexity might, in
fact, be objectively true. This compels me to fully master Rust by
writing a few very significant projects in it --- especially things that
benefit from Rust's strengths: small run-time, no garbage collection,
memory safety, the PEG <https://pest.rs> library, and raw speed.

Therefore, my plan is to entirely complete and grow kn into the primary
utility and keep it in Bash.
Then I will supplement it modularly by providing additional commands
such as a Pandoc Light Parser in Rust that can be used independently in
the Unix philosophy and integrated *into* kn through regular command
calls, just like I do with pandoc now.

##  Sunday, June 14, 2020, 1:01:33PM

Here are my most common needs when it comes to tools and languages:

1. Stuff to help me automate and simplify life on the command line
1. Stuff to create highly efficient, Unix-philosophy compiled commands
1. Stuff to make front-end web sites and apps
1. Stuff to make back-end web services

I imagine myself looking into the toolbox when I'm about to work on
something.

* If I need to duct-tape a bunch of stuff together to make something
that just works I grab Bash.

* If I need to create a highly optimized command that can be duct-taped
I use Go, Rust or C.

* If I need a front-end web site or app I grab Markdown, HTML, CSS,
JavaScript, Pandoc, and Vue.

* If I need a back-end service I grab Docker, Go, Rust, Nginx, GraphQL,
PostgreSQL.

When it comes to getting shit done there is nothing faster than Bash. I
have objectively demonstrated that to myself over and over again. The
twitch tool, my kn utility, and my repo GitHub *and* GitLab tools have
been in Bash and will *never* be rewritten in any other language. There
just is no need. Now that I've discovered coproc and started using & to
background processes more I simply do not need a complicated, strictly
typed language to achieve maximum concurrency as fast as possible ---
both in terms of execution *and* development speed.

##  Sunday, June 14, 2020, 12:45:17PM

So I have to admit Rust syntax and style is really borked, but I'm
definitely okay with it. It did cause me to reconsider my fetish for
2-space tabs that I took to some two years back after doing a lot of
JavaScript development.
Now that [Linus Torvalds and everyone else agrees 80 characters is too
limiting given modern terminal
widths](https://duck.com/lite?kae=t&q=Linus Torvaldz and everyone else agrees 80 characters is too limiting given modern terminal widths),
all the arguments for 2-spaces really fall apart --- especially since
four spaces has been the standard for most languages for more than two
decades. It is one of the few things that Python, PHP, Perl and shell
coders generally all agree on. Plus it is just so much more readable ---
especially when disabling syntax highlighting of any kind.

So I have been slowly but surely changing all my code and writing to
include the new 4-space standard. It's amazing how much something so
seemingly trivial can produce *so* much work. I feel like I just can't
get my feet under me lately with so much going on. When I do run into
old stuff, like my first knowledge base back in 2013, I realize how much
progress I have made.

The Knowledge Management Utility (kn) that I have been working with
during all of this is *so* ideal because it just gets so much done so
quickly. There is seriously nothing faster than prototyping in Bash.
Bash *blows away* my best productivity with Python, Perl, Go and any
other language I've used in the past. It is really such a shame that
more people *don't understand this fact* because they haven't been
exposed to it. But I'm doing everything in my power to change that and
shoot our collective productivity forward for those with the good sense
to truly consider and understand why.

Just like any age, most people will continue to simply not understand,
but it warms my heart to encounter someone like
[\@gamozo](https://twitch.tv/gamozo) and be like, "Woah, this dude
*knows* what I'm talking about." In fact, I find that people in the
pentesting and security field almost automatically understand where I'm
coming from.
That is enough consolation for me since the 37% annual increase in
demand for the fastest tech career in the world is specifically full of
those people. We are all just hackers at heart even if we are doing the
tools engineering and not the pentesting. All the other Java and C++
developers can just play with their toys and leave us alone.

I won't say it is a cliche, but it's a cliche. You either get it and are
one of the cool kids or you don't and frankly don't belong. No need to
be mean, just realistic. A *lot* of people just don't have what it takes
to truly master the terminal including Bash scripting and prototyping.
Most don't have the commitment and ability to even master touch typing,
let alone terminal skills. The world will always be filled with such
people. They aren't *bad* people or *stupid* people. They are just not
hacker material and never will be. The end.

##  Sunday, June 14, 2020, 10:44:55AM

Today I opened up my Basic Markdown video to find a "dislike" for no
apparent reason. Humanity has invented very few things as fucking stupid
as "likes" and "dislikes". The Black Mirror episode about it doesn't
even come close to capturing it. Someone can be a complete fucking moron
and simply dislike something, voting it down. Then another dumb-ass can
upvote something else. *You have no idea about the spineless cowards who
refuse to take two minutes to justify their position.*

The *only* reason likes and dislikes exist is so that the service
providing them can suck in more shallow participants, ad revenue, and
money. In other words, once again marketing and ad revenue are
*destroying* authenticity and intelligent dialog.

##  Saturday, June 13, 2020, 11:34:18AM

Feels good getting all those Types captured. Now time to rip apart the
entire "studio" and reorganize the house. It's nice to have the weekend
"off" but I've just filled it up again with other shit to do.

##  Saturday, June 13, 2020, 10:23:23AM

The list of knowledge node Types is shaping up nicely.
It's evolved a bit from earlier versions which did not allow nodes to be contained within others. This directory organization --- along with the constraints on node directory IDs --- has made for several improvements that were not possible before.

Paragraph
:   Just a single paragraph with Pandoc Markdown emphasis.

Unordered
:   Simple, one-level list containing Markdown that will be rendered by looping through and loading the template partial specified for the node. By default each item in the list is a simple unordered list item with a bullet.

Ordered
:   Simple, numbered, one-level list containing Markdown that will be rendered by looping through and loading the template partial specified for the node. By default each item in the list is a simple numbered list item. Numbered means numbered, not lettered.

Blockquote
:   Same as Pandoc Markdown.

Verbatim
:   Same as Pandoc Markdown.

Code
:   File contains only code in a given language.

Article (Default)
:   Akin to a "page", "document", magazine article, blog post, definition, encyclopedia entry and such.

Spec
:   A specification can be as informal as a challenge to test skills doing different programming tasks or as formal as a corporate project RFC. Stories contains the user stories, sentences or paragraphs describing examples of how stuff will work. Stories are usually no more than three lines of text. More formal specs can include Reqs lists to fulfill the full [INVEST](https://scrumandkanban.co.uk/what-makes-a-good-user-story/) characteristics of a good spec. Think of how a highly skilled team of professional hackers would outline an op, or what a corporate client might want built into an application. Both are Specs with Stories and Reqs describing the *outcome* at each step along the way to success.

HowTo
:   Step-by-step walkthrough of how to execute a specific or generic task, often outlined in a Spec node. Each step must have a summary of the step, followed by a detailed explanation and demonstration of how to do it. Use header levels to group and implicitly number the steps. There are no special YAML traits. Usually a single HowTo contains several Prereqs to help avoid maintaining an excessively hyperlinked body. Adding the Spoilers boolean will progressively reveal all headings and paragraphs incrementally.

Log
:   Chronological listing where each second-level header is a time stamp, the format of which is defined for the knowledge base or individually in the YAML for a given node. Order is irrelevant since that is a matter of sorting for the consumer of the data, which can easily be altered through DOM manipulation with JavaScript, for example.

Collection
:   A list of maps specified in a Schema header using YAML !! notation for the basic types float, str, list, map, bool, timestamp, set, binary, etc.

ParaList
:   Similar to a Bulleted list but each item is a simple paragraph with an initial topical sentence that is usually bolded.

NumParaList
:   Same as ParaList but numbered. Numbered means numbered, not lettered.

Table
:   Simple table where Fields are specified and each record in the list is also a list. Generally a Collection is preferred for its superior ability to capture complex data structures.

Image
:   Single image.

Video§
:   Single video reference. Usually video will not be stored locally. If so, the rendered node will contain an embedded player for the video. If the URL is external *and* Internet access is detected, the video should be embedded using the default video partial or one for the specific node if available. A thumbnail.png matching one full frame of the video resolution should always be included. Any additional Markdown can be included in the body of the node to annotate the video. Nothing should refer to any specific video service, using minutes and seconds notation instead to allow the viewer to find the location in the video themselves. Can be combined with other node types and will render differently depending on the combined type. When combined, the Titles are identical.

Audio§
:   Essentially the same as Video but without a thumbnail.png. Can be combined with other node types and will render differently depending on the combined type. When combined, the Titles are identical.

ImageMap§
:   An image with clickable regions that link to internal nodes and external resources. Rendered as a map in HTML.

Diagram§
:   Any large image of any type that can be displayed inline as well as providing a link to a file containing the diagram. Usually contains annotations as well.

Slides
:   Follows the specific Pandoc slides format conventions.

Chart§
:   Any of a number of chart format types. A fallback chart.{png,jpg,svg} must be provided but rendering formats can progressively replace it with interactive chart renderings of any kind.

Quiz
:   Simple YAML list of Qs and As where the questions are single paragraphs of Markdown of reasonable size, which can contain images, and the answers are regular expressions for every possible accepted answer. Renderers can progressively allow the answers to be typed in and validated in real time. There will never be multiple choice of any kind, ever. Just a list of human-readable acceptable Answers that match the regex, which can be revealed with a click, or a tap on mobile devices.

*Outline*
:   An automatically generated node that recursively examines all the nodes listed by ID in the Outline property and creates a *plain* indented outline by title only.

*Bookmarks*
:   An automatically generated node that reads the Titles of all the listed nodes in the Bookmarks property and links them to their IDs and URLs. Can include both knowledge nodes and external Web URLs.

*Catalog*
:   An automatically generated node that reads *all* YAML properties from all of the listed nodes within Catalog, making the data within them available to the template renderer. Rendering depends completely on the target rendering format.

*Redirect*
:   A simple redirect to another page. The Redirect points to the new location. These are picked up by renderers and collected into whatever form of redirection functionality is available for the rendered format. For example, an HTML renderer gathers them into a _redirects file.

*Aggregate*
:   An aggregate of knowledge nodes into a single node containing all the content from the aggregated nodes in the order listed in Aggregate. Useful for composing books, articles, courses, and such from existing nodes. Nodes listed in Aggregate may be local or remote. If remote, a full URI is required and the remote node must fully comply with knowledge node specifications, which are mostly just to have a README.md file contained within the URI and that only local dependencies marked with ./ will be pulled over.

*Latest*
:   An automatically generated node that recursively examines all the nodes within itself for those with the latest changes and displays them. Ignores and other criteria are passed through the Latest property.

* *Italics* indicates a node that is mostly automatically generated.
* § indicates a node that can be combined with any other node that is not *italicised* or also marked with §.

I'm really glad to see the notion of optionally inlined header includes again. That was a part of the original Essential Web design in 2014.

## Saturday, June 13, 2020, 10:05:42AM

Feeling really triggered by something as simple as an unnecessarily complicated URL like those on the completely brain-dead documentation site <https://readthedocs.org>. Here's an example, the URL to their "Getting Started" page:

<https://docs.readthedocs.io/en/stable/intro/getting-started-with-sphinx.html>

Yeah right, like anyone is ever going to commit that to memory. How fucking stupid do you have to be to build a system that results in URLs that span more than a screen of text?!
You know what this *should* have been? <https://en.docs.tech/intro/start>

But that is too much to expect from a bunch of Python-loving, Sphinx-pushing non-writers. Perhaps they think that URL will do more for their SEO (like I once did). It doesn't. Sphinx is *so fucking stupid!* Oh my God, I'm so triggered. You can tell engineers made it and not anyone who actually has to write for a living. You know how stupid their designs are just by noticing their focus on Python and reStructuredText. Hell, they don't even use standard Markdown nor even seem to know what Pandoc is. This triggers me because it is --- once again --- an example of people designing things *without even imagining how the shit is going to be used*. People *completely* put out of their mind the person using what they are designing. Don't get mad, get busy, Rob. Don't get mad, get busy. What we need is not yet another application or site; we need some standards and best practices for the content *itself* first, and then any application can be layered on top of that.

## Wednesday, June 10, 2020, 7:17:01PM

I'm revisiting the language and terms I use a lot lately. One term that really holds up is *challenge*. People really respond to it. It's essentially the same thing as an *exercise* but invokes a lot more motivation and fun. Another place that we see challenge-based learning *really* skyrocketing is in the CTF and security space. Essentially every "wargame" *is* a challenge. The only difference is that there isn't something to find, no prize to unlock or discover. But maybe there should be? The word *challenge* also can replace the word *project* where a specification is written out and the challenge is complete when the project matches the specifications. After considering the boring term *howto* there really is no comparison. The *howto* is simply the *walkthrough*, the *recipe*, the *solution*.
I'm trying to reduce the complexity of terms and number of menus on [RWX.GG](https://rwx.gg){.spy} and think I've got them now:

-------------- -------------------------------------------------------------------------
Welcome        How to join our community and what we are about, how to connect.
Questions      Dos, don'ts, who, what, when, where, why?
Boosts         Combinations of Tools, Challenges, Definitions, and Articles.
Challenges     Projects small and large, howtos, walkthroughs.
Latest         Automated recently created, updated, or modified content. Also news.
-------------- -------------------------------------------------------------------------

Each of these is actually just an index. Boosts are the trickiest to categorize.

## Wednesday, June 10, 2020, 6:52:35PM

Made the mistake of clicking on my stream stats. My high point was 125 subs. I'm down to 67. It actually just makes me laugh. It shows how fickle people are and how much they think that somehow by *buying* a book or subbing to a streamer all their magical learning problems will go away and they'll be able to suddenly get great jobs as coders or hackers.

So it's back to streaming as usual. Eventually I'll add the schedule back. But I'm not ready yet. I'm on the verge of committing to never do a video destined for YouTube until I have the entire write-up complete so that I can read from it and people can follow along, no more summaries and overviews.

I *really* do love the people who have been tuning in and trying their best --- especially those expressing their support for the herculean effort it takes to make all of this available. That is why I stopped the boost. People were dropping like flies. They couldn't keep up. I said from the beginning it would require an *eight hour per day* commitment. But that was just too much. I *knew* it going into it, but even the fastest among the group couldn't keep up.
These are smart and dedicated people, most of whom have *no idea* how to learn and take responsibility for their own learning. The answer is the same as it always was: modular content that they can consume when they want and need at their own pace *and a community there to help them with motivation and answers.* That is what I will continue to provide. But I do laugh because, as usual, the *right* way to do something is the *opposite* of the way that gets the most attention and money. I never did the boost to get any money. I *always* wanted a way to get feedback on preparation of materials for everyone. It's a good thing too, because as soon as I made the change everyone just stopped coming.

Then to make matters "worse" (if my priorities were actually on my success as a streamer according to Twitch statistics) I went and started OverTheShoulder streaming again, which is when I don't respond at all to anyone in the chat and just live-stream what I'm writing and doing. Only a hearty few really care for such things --- especially when it involves just thinking through things like knowledge management and not showing everyone how to hack or make millions in the latest fad language.

I suppose the thing that saddens me the most is how frequently I encounter people unwilling --- or worse --- unable to simply communicate in written form at all. So much of the value I provide is by writing shit down that no one yet has and working on sustainable ways of keeping it up and maintained. Sometimes it seems that only a very few actually *really* appreciate that side of the work.

## Wednesday, June 10, 2020, 6:10:33PM

The last few days the news has been filled with the gory scene of a 75-year-old activist doing nothing but *standing* being pushed to his brain-death as he fell on his head and bled out on the sidewalk in front of his city's town hall.
Our shit-for-brains, legally certified insane non-president said essentially that it was his fault for being there, and a bunch of other Republicans agreed. This is why I don't watch this shit very much despite how important it is. Watching that video is bad, but watching the unapologetic cops that did this be released to *cheers from a full crowd of cops and other supporters* is enough to seriously activate *anyone* with a heart. Empathy is dead.

## Wednesday, June 10, 2020, 4:19:50PM

So hard going through knowledge base migrations. People want the old content because they don't know something new is available, and sometimes the new stuff has bugs and broken links. I'd go so far as to say that maintaining a knowledge base is *way* more difficult than maintaining any open source project.

## Wednesday, June 10, 2020, 2:52:11PM

Fundamental to the whole knowledge issue is the *inability to audit the knowledge*. If it were code or some type of system you would call it "technical debt", but we don't have the same concept when it comes to our knowledge systems. I think the fix might have something to do with *purposefully* breaking things, meaning that if something does *not* get an update it *automatically* disappears, thus *forcing* content maintainers to receive notifications that stuff is *explicitly* breaking.

What are the things that can be audited in a knowledge base?

* Are there any broken hyperlinks?
* How old is the content? Has it expired?
* Have any dependencies expired?
* Regular spelling and grammar things.

"Information is relevant as long as THIS is true. 'Great phone to get.' --> 'This phone is top 5 best phone.'"

"Deep learning" depends on "Machine learning".

## Wednesday, June 10, 2020, 9:51:20AM

One of the simplest questions that has driven me nearly insane is what *specifically* to call what O'Reilly started calling "recipes" and others call "HowTos" and even "tutorials".
That last one is so *phenomenally* inaccurate yet popular that it actually makes my blood boil more than I'd like to admit. I have blogged about these definitions so much it all starts to glaze over. So here's another blog post *about the same thing*. Thankfully we actually have some solid terms for this stuff in the technical realm, it's just that muggles have no idea what they mean without some explanation:

Term         Definition
------------ ---------------------------------------------------------------------------------------
Operation    Something done with parameters without specifying how it is accomplished.
Method       The *way* an operation is completed.
Procedure    Same as method but used more commonly when involving people.
Function     Takes optional input and returns output with no side-effects. Also a job with people.
Task         Sometimes operation, others method or procedure. Usually time-bound.
Skill        The capability to successfully execute a specific task.
Ability      A capability that is generally more *innate* and much harder to learn. Superpower.
Job          All over the map. A specific task underway. Also a person's occupation.
Occupation   What a person does with their time, usually to make money and fill a social need.
HowTo        Technically a method for people. Often associated with task. Requires skill.
Knowledge    The stuff that can be contained in a human brain, including skills and abilities.

You can see that there is a lot of overlap between the terms --- some of it very confusing. The review has been helpful, however. It seems clear that the term I should use on RWX.gg and in other knowledge bases is howto because it speaks so clearly to *most* people. "Oh, it's 'how to' do something. Okay. \*click\*" It also makes the IDs read like English sentences like (/tools/ssh/howto/catpub). In a pseudo-OOP form for the human "class" it might be Human.ssh.catpub() and if my specific instance of Human called the method rwxrob.ssh.catpub(). I won't lie.
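Purely for fun, that metaphor could be sketched out. Everything here --- the Human class, the SSHSkills namespace, the catpub method --- is hypothetical, just the node ID read as code:

```python
# Hypothetical sketch: knowledge node IDs like /tools/ssh/howto/catpub
# read as method calls on an instance of Human.

class SSHSkills:
    """Namespace of ssh-related howtos a human can execute."""

    def __init__(self, owner):
        self.owner = owner

    def catpub(self):
        # The "method" is really just the howto being executed.
        return f"{self.owner} executes /tools/ssh/howto/catpub"


class Human:
    """The human 'class' from the pseudo-OOP metaphor."""

    def __init__(self, name):
        self.name = name
        self.ssh = SSHSkills(name)


rwxrob = Human("rwxrob")
print(rwxrob.ssh.catpub())  # rwxrob executes /tools/ssh/howto/catpub
```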
Imagining what code would look like that described real-world procedures being done by instances of humans makes me smile. I'm just that weird. Later when we find out that we actually *are* just computers at some level it will be even more entertaining. But I seriously doubt any algorithms in the natural world have ever been *coded* as opposed to *discovered* through millions of small improvements, as we see with machine learning today. (Which reminds me, I need to make time on one of my now free weekends to crack open the calculus of machine learning and really sink into it.)

There's another interesting fact that emerges. Human brains can be discussed and approached just like any other device that can be programmed. It's just that the *method* of programming a human brain is radically different than programming modern digital computers. This fact can inform decisions about how to organize knowledge and design methods of learning. It is the reason that *knowledge source code* is an accurate description of any stored knowledge that a human brain can consume, the most fundamental of which is text.

The one thing that the 90s OOP insanity got right was the *clear* distinction between *operations* and *methods*, much like Pascal and SQL got *functions* and *procedures* right. Thankfully it is just lazy people that call these things by their incorrect names today, because the languages themselves allow them to be referred to correctly in Bash, Python, and JavaScript. You don't *have* to use the brain-dead "function" keywords in those languages. I mean, ***IT'S NOT A FUCKING FUNCTION!*** Okay, calm down. Rust and Go jumped onto the stupid-train by forcing the use of fn and func for things that have *nothing* to do with functions. That was sheer laziness by the creators of the language syntax. Perl has sub, which is ideal because, after all, that is what most of those things actually are. But God help you if you use the term *subroutine* today.
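To pin the distinction down with a minimal sketch (the names are mine, not from any particular codebase): a true function takes input and returns output with zero side effects, while a subroutine/procedure is called *for* its side effect.

```python
# A true function: input in, output out, no side effects.
def double(x):
    return 2 * x


# A subroutine/procedure: called for its side effect and returns
# nothing meaningful. Slapping a "function" keyword on it does not
# make it a function.
def record(value, sink):
    sink.append(f"result: {value}")  # side effect: mutates sink


log = []
record(double(21), log)
print(log)  # ['result: 42']
```

Call that second one a *subroutine* out loud, though, and see what happens.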
"OKAY BOOMER," is the response, even though it is so much more accurate it's actually sad. All these people who don't know the difference are actually the problem. They cannot decide if their code in this block right here is actually a function or a subroutine/procedure and *that* is what is causing the problem. Who am I kidding? No one gives a shit. The massive amounts of absolute crap code that hackers are walking through, and that is killing people in their cars, still isn't enough to make people care. Besides, the people being affected aren't the coders writing it. It's the poor souls being affected by it after the coder has cashed out and moved on to the next easy gig. That trend is only going to get worse as more and more people don't even bother to vet the code at all.

## Tuesday, June 9, 2020, 9:45:12PM

Realizing that I have to have a solid look at the data model for RWX before getting too much further into adding content to it. I'm up to some 200 or so nodes and the organization is starting to weigh a bit like it always does when I do a refactor. This time, however, it is rather obvious that YAML and partials are the answer. There are a few specifics I have to work out for the common YAML metadata fields. I have a solid Category and Type grouping defined from before all this happened. But there are other state properties that need to come into play in order to allow them to be displayed in indices without a problem but also without providing *any* broken links. Here's what I'm thinking so far:

Title
:   Most important and 100% relevant even if rhetorical.

Query
:   Set to anything (true) to activate an external search query hyperlink for the Title.

Subtitle
:   Snarky or clarifying addition. Not necessarily relevant. More rhetorical.

Category
:   Specifies the *topic* of content. (ex: boost, course, howto, review)

Type
:   Specifies the *format* of content. (ex: article (default), table, log, revlog, walkthrough, schedule, toc, index, list)

Summary
:   Only requirement is that it be a single paragraph/line. Should cover everything from reading the whole thing, including premise and conclusion, with as much justification for the conclusion as possible. Reading the whole thing should be for those who want the details. This is the TLDR but better. In many cases the Summary might be all that is needed --- particularly for definitions.

Created
:   Optional date and time in ISO (without the T) when the original content was created. Particularly useful for articles.

Updated
:   Last date and time in ISO of any official update or errata fix. Different than the Created time stamp.

Planned
:   A rhetorical or ISO date or time span when this particular node can be expected to be complete and available for full reading and use. This field allows the framing of content in an organized way without creating broken links. While having a Summary available along with Planned is preferred, it is not required.

ExpiresOn
:   Exact date on which this content will definitely have expired if not at least checked for relevance. If omitted, let the auditing tool decide on a default age based on the Updated, Created, or last modified dates.

PreReqs
:   Stuff that is *good to know* and *strongly recommended* before consuming this node. This node cannot exist without the other prerequisite nodes also existing and being up to date. Critical+ relationship. Not only will this node be omitted if a prereq is not available, but the prereqs should be consumed *before* this node.

DependsOn
:   Creates a composition relationship with the node or external content referenced, meaning that if the dependency changes or is removed, this node should be made immediately defunct until the dependency can be resolved or a new dependency can be established. This means, for example, that if a dependency returns a 404 during a link scan, the node is *omitted* entirely until the link is resolved. Critical relationship.

SeeAlso
:   Set to a list of any Markdown string containing local or external links, references, or even just words referring to things the reader should look into. *Be generous in adding entries --- especially those that objectively and rationally take an opposing position to conclusions in the content --- in order to encourage examining a breadth of input and data on the topic.* Loose relationship.

Deprecated
:   A text field explaining why the node is *old* enough to be ignored, but *leave* it so that others understand that it *is* deprecated in case other external nodes have linked to it.

Contributors
:   The Name and site ID of anyone who has contributed to this specific node. The ID is also the canonical link to /contrib/<ID>/ which any contributor can add and update for themselves at any time with a merge request. The Contributors of the root node (.) are those who have contributed to more than 20 specific nodes in such a way that being listed as a main or default contributor makes more sense.

Video
:   The full URI to the accompanying video, be it on the Web or elsewhere. If this is set, any indices created will have a television emoji 📺 that links directly to the video so as to make more sense than is possible through any sort of video service playlist organization, even if such is in place as well.

Audio
:   Same as Video but Audio. If Video can be consumed simply as Audio as well, then just use Video instead. This distinction is more about the help application that will be used than the format itself. However, whenever possible *all* content should be consumable *entirely* as written text, with *everything* else being secondary unless absolutely required to convey the knowledge (review of music or video, for example). Always take a progressive approach to knowledge. First words, then *supplement* the words with images, audio, and video. In other words, prepare *everything* as if for someone with visual and hearing impairments. Aesthetic appeal is fine so long as it does not put the content out of reach for *anyone* unnecessarily.

TODO
:   Stuff that remains to be done on this node. However, never leave a node with significant stuff on the TODO list *unless* you have set Planned as well to indicate that work is, in fact, planned for this node.

Well that was more than I was planning on. But it looks good. I'll need to move it into the /contrib/guide/metadata/ at some point. Feels good to have it done though, because now I don't have to change the *format* and organization *yet again*. Having a more structured YAML-centric knowledge base is definitely the way to go. One thing that occurs to me writing up the Planned explanation is that I can frame the content that needs to be written and allow very specific contributions from others in the community who may be willing to flesh out the rest of the node, which prompts me to add a Contributors list to each node. I did notice some redundancy between hyperlinks in the Summary and entries in SeeAlso, but I suppose that is fine since both fit on the same screen when editing. Generally I think SeeAlso will tend to have more in it since it will include opposing sources as well eventually.

Categories are going to be a little tricky, but probably worth it.

boost
:   Just to get beginners going. Not necessarily remedial content, but content that leaves a lot for the person to figure out on their own. Even something like the Join Us welcome node is a boost because it doesn't get into the particulars of how to join Discord and such. If it did it would be a walkthrough instead. If it were just a bunch of marketing speak trying to convince people to join then it might be an article instead. (By the way, that stuff usually goes in the root/cover node.)

Because of the nature of Pandoc templates there also needs to be a bunch of mechanical properties that have nothing to do with the metadata itself. Since these are all related *specifically* to Pandoc template generation (or whatever other rendering format or rules) they *must* begin with an underscore (_).

_boost, _article, etc.
:   Same as Category but required for simple Pandoc boolean templates.

## Tuesday, June 9, 2020, 9:44:32AM

I'm really glad I took some time to really dive into the new partials in Pandoc 2.9.2. It is rather obvious at this point that the Pandoc team is moving toward structured data documents with an emphasis on YAML. It's amazing the conclusions like-minded people tend to make when they are all working on the same problems, because they are things that *they* need. This was the *exact* reason that I created Hugonot and FADB back in the day. I was obsessed with TOML and thought that it was the best structured data format for such things. Clearly the right format is YAML. I especially believe that now that I understand YAML linking and types, which allow YAML to move deeply into the database space, not just configuration files.
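A tiny sketch of what I mean by YAML linking and types --- anchors and aliases for linking, plus explicit !! type tags from the spec. Every field name here is made up purely for illustration (and the <<: merge key is a YAML 1.1 convenience, not part of every parser):

```yaml
defaults: &defaults          # anchor: a reusable chunk of data
  Type: article
  Query: true

node:
  <<: *defaults              # alias merge: "linking" inside the document
  Title: Example Node
  Weight: !!float 1          # explicit type tag
  Tags: !!set                # sets and other rich types, not just strings
    ? yaml
    ? pandoc
```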
People hate on YAML for having significant whitespace and normally I would agree, but this *isn't* a programming language, and readability and stylistic expression --- without sacrificing consistency --- is really the dominant priority. Most of the time when I encounter a YAML hater they have not even tried to use it once --- typical.

So I am facing a real dilemma but I think I know the answer already. I just have to write it down to process it. It might be more informative to recount how I arrived at it. The first few knowledge bases I created for skilstak.io, the Essential Web, the Knowledge Net and SOIL were all focused on content contained in Markdown. At first I forced myself to use only [Basic Markdown](https://rwx.gg/lang/md/basic/) but even before that I was using something from 2009 that didn't even allow long single lines as paragraphs. I would never have allowed "metadata" into the Markdown and thought that mixing the two was a violation of separation of concerns. I abhorred Jekyll for "front matter" and having forever ruined the cleanliness of *pure* Markdown. A lot of people don't know that having YAML --- or anything --- at the front of a Markdown file was *never* original Markdown and still is *not* a part of CommonMark, even though Pandoc and others wisely allowed it.

I now have enough experience to realize keeping the metadata with the Markdown is preferred because it allows the file to be migrated rather easily. In fact, YAML specifically allows other body text to come after the YAML section. So in a very real way Pandoc Markdown files are, in fact, YAML files with Markdown in them, which brings me to the main point. Knowledge bases are *at least* half structured YAML data and half free-form written content. But the more I factor out knowledge into automata that can be aggregated in any number of ways --- in order to remove redundancies --- the more I realize even a traditional concept like chapters gives way, let alone the whole notion of a book.
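To make that main point concrete: a Pandoc Markdown file really is a YAML document with a Markdown tail. The field names here are illustrative only:

```markdown
---
Title: Why Plain Text Wins
Summary: The one-paragraph TLDR that might be all anyone ever needs.
Created: 2020-06-09
---

The *body* is ordinary Pandoc Markdown. Squint and the whole file is
structured YAML data whose trailing free-form text happens to be
Markdown --- half data, half writing.
```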
Instead, indexes that aggregate titles, links, and summaries make way more sense. Taken to its full form you get dynamically composable documents that either an author *or reader* can organize for themselves in a cafeteria-style approach to knowledge consumption. This isn't hyperlinking, but it definitely builds on the idea. I abandoned self-populating dynamic documents with the Essential Web. But I find myself doing about the same things now as I create utilities for aggregating the knowledge. The simplest form is an index. A bit more and you get a long index with summaries. But it isn't much further to load the whole knowledge node, statically or dynamically. This means that books, guides, and such can be *compiled* in different ways, much like source code.

I am shaking my head at all the *massive* failures to accomplish similar things: ReadTheDocs, GitBook, wikis, and more. None of them successfully understood and acted on the fundamental concept of factoring knowledge into the most finite form and then aggregating it back *without regard for any particular target format or rendering*. All of them assume something about the final output. I'm more than a little annoyed that humans haven't figured this out already and created something to do this instead of my having to create it. But I take solace in the fact that the very brilliant people on the Pandoc project seem to have arrived more at that conclusion than any others with Pandoc 2.8 and partials. This allows the finite knowledge form to be a YAML document with text fields that *include* Pandoc Markdown. Conclusion? ***YAML all the knowledge.***

Seriously, JGM and the gang are just so fucking amazing on so many levels. They bring practicality to everything they do rather than just creating yet another over-engineered piece of shit like so many knowledge solutions out there that came from engineers (like reStructuredText, Gatsby, Hugo, even Jekyll).
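The simplest aggregator described above --- an index compiled from node metadata --- might look something like this sketch. The directory-of-README.md layout and the naive front-matter parser are my assumptions for illustration, not a real spec:

```python
import os

def front_matter(text):
    """Naively read 'Key: value' pairs from a leading YAML block."""
    meta = {}
    lines = text.splitlines()
    if not lines or lines[0].strip() != "---":
        return meta  # no metadata block at all
    for line in lines[1:]:
        if line.strip() == "---":
            break  # end of the YAML block; Markdown body follows
        if ":" in line:
            key, _, value = line.partition(":")
            meta[key.strip()] = value.strip()
    return meta

def build_index(root):
    """Aggregate (id, Title, Summary) from every node under root."""
    index = []
    for dirpath, _, files in sorted(os.walk(root)):
        if "README.md" not in files:
            continue  # not a knowledge node
        with open(os.path.join(dirpath, "README.md")) as f:
            meta = front_matter(f.read())
        node_id = os.path.relpath(dirpath, root)
        index.append((node_id, meta.get("Title", ""), meta.get("Summary", "")))
    return index
```

From the same factored nodes you could just as easily compile a long index with summaries, or the full aggregated "book" --- the output format is a rendering decision, not a storage one.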
The Pandoc team has *authors* as their priority and *never* forgets who is using the product. The others seem to always target *coders* instead, which is why they continue to fail over and over again.

## Wednesday, June 3, 2020, 7:54:48PM

My God the clarity! The more things change the more they stay the same. The recent personal revelations regarding rwx.gg have overwhelmingly confirmed that my gut was *always* right --- we're talkin' from the beginning in 2013. Way back then I was motivated internally to help people learn the Linux terminal command line. But not only to learn it, to *start* with it and *master* it. I now fully realize why that was, in a way that I can put words around. Linux --- and specifically the command line --- is the best tool a curious, obsessed, technical autodidact can ever have at their disposal. It is like the ultimate sandbox in which to play, experiment, learn and create. No wonder Raspberry Pi Linux devices are *so* huge and have become standard educational tools in places where education is *actually* understood. Hell, even England has Linux and Raspis included standard in their curriculum. (Not our fucking disastrous dumpster fire of a country 'Merica over here.)

The point is, I was drawn to the Linux terminal for all the reasons I wanted to share it: to provide the same learning opportunities for others. Now I've come full circle back to eliminating *everything* but learning how to learn, followed by learning the Linux command line and how to code in Bash. *All* the rest builds from there. No more futzing around with JavaScript even though it is valuable. The terminal gives *way* more opportunities for far more projects.

## Wednesday, June 3, 2020, 4:24:07PM

June 1st marked the end of the first *Linux Beginner Boost* experiment. I decided to cut it short because people are clearly falling behind and cannot keep up with the flipped, 8-hour-a-day approach. I tend to always be optimistic about people's ability to commit.
I'm not particularly sad about it. In fact, after touching base with a few people confidentially, it is consistent with what they feel is needed. One good comment was from someone who had taken the autodidactic approach so to heart that he became obsessed with learning C immediately. He got the book I had reviewed and started on it right away. He was feeling a bit of guilt for not tuning into the 3-hour-long videos every day because *he was spending that time coding instead*. When I heard that comment I was immediately sold on my *original* focus, which was *primarily* on motivating and helping others master the skills of an autodidact. All the rest is just curating the best resources for people to pick and work through on their own, as well as creating my own resources in the form of knowledge apps and videos. The result is a hyper-topic focus during each segment that is then turned into a YouTube video. This allows the videos to be processed asynchronously, as has always been an advantage of videos. But it also allows those particularly interested in a given topic to join me live and chime in, ask questions, or whatever. This continues to promote the community aspect of live-streaming that strongly distinguishes it from regular YouTube videos. This has meant yet another reorganization of the [RWX.GG](https://rwx.gg){.spy} site.

## Sunday, May 24, 2020, 12:00:11PM

Discovered [Kant's categorical imperative](https://duck.com/lite?q=Kant's categorical imperative) today thanks to [@green_jenny](https://twitch.tv/green_jenny). "Act only according to that maxim whereby you can, at the same time, will that it should become a universal law."

## Sunday, May 17, 2020, 9:34:46AM

<http://linuxclass.heinz.cmu.edu/>

Just found this course from Carnegie Mellon University randomly since it is the second hit from the [The Linux Command Line](https://rwx.gg/books/tlcl/) searchable title.
Check out the [first lecture](http://linuxclass.heinz.cmu.edu/Linux-Syllabus-S20A4.odt). I can't lie. I laughed my ass off, then I cried a little at the exorbitant costs of these *remedial* Linux courses. I hope colleges get the collective kick in the ass that they need from this whole COVID thing. College priorities have *long* been other things besides the best education of their students.

## Saturday, May 16, 2020, 5:57:12PM

As I have been working on <https://rwx.gg> and adding more content to it, I realized it would be great if there were a setting where users could simply click on the title to be taken to a DuckDuckGo.com search. It wasn't too hard to implement, but the question of whether to use the *lite* version or not came up. In Lynx it looks beautiful, in Chrome not so much. I decided to write some JavaScript that upgrades the URL to the non-lite version in Chrome. It only runs when a browser has full JavaScript support. It then occurred to me that the foundational principle of *progressive* web apps is that they *progressively* take advantage of the capabilities of the browser technology available. Then it occurred to me what a *massive* fail our modern Web is based on this foundational principle. All this unnecessarily broken React shit is going to implode, I swear to God. At least my personal observance of the *progressive principle* is beyond even what most are considering in the graphic web browser world. We live in a world of absolutely clueless asshole developers who both do not care about this principle and couldn't even implement it properly if they did. At least I can do something about it myself (and help others not be so *fucking* clueless).

## Saturday, May 16, 2020, 4:14:50PM

Just had a panic attack because suddenly Discord decided to echo back everyone's voice to the point where it was really distracting. Tomorrow is Coffee Talk and I was stressing out.
I think it has something to do with the reboot the other day and possibly the fact that I had to change the input back to the mixer board (it auto-picked the camera mic). Turns out all that had to be fixed was a "Reset Voice Settings" in Discord.

## Saturday, May 16, 2020, 1:22:51PM

Just talking casually with my wife about all this and got a rather nice comment about my mentoring ability on the YouTube channel, strongly asking for more. My superpower is curating, communicating, and mentoring; the rest is all secondary, including the fact that I can code, hack, and build systems. There is a character called Phillip in the *Travelers* show that I've been watching who is like a walking historian and computer. The team lead, McClairen, is a self-proclaimed "languages guy" besides being a natural communicator and leader. I find myself relating to both of them a lot. When I crack open the 1200-page textbook "bible" of *Algorithms* all I can think is, "why the fuck do I care?" Maybe that is why I switched from Computer Science as my major after my first semester when I discovered the humanities and language labs. Ironically the computer technology in the languages lab was *far* superior to that in the computer science lab. Why? Because it was *relevant* and being used. The CS dweebs would laugh at the Macs there, but people were getting serious shit done with them every day, not debating the merits of C vs. Pascal as I would hear the lab attendants doing while suffering through CS101 bubble-sort Pascal programming and taking pencil-written tests for coding. In fact, (I'm just gonna say it) I fucking *hate* infinitely removed computer science shit. No one cares until it provides something the world needs. Do we need people with that level of computer science understanding? Hell yes! But what we need even more is people *using* these amazing tools to produce software, information, and systems that will push humanity forward.
In fact, the number of strictly trained CS people the world needs is really not that great. But the world is starving for people with practical knowledge who will push it forward. God knows there is enough work for thousands of people just to counter the fucking idiocy of the JS-JS-JS crowd who would destroy the best information-sharing standards the world has ever known. The world needs more people with common sense, the courage to speak truth to power and idiocy, and the skills to counter the horrible shit other morons are making. That's my priority, not how to create the best possible linked list.

## Saturday, May 16, 2020, 9:21:00AM

I keep coming back to the conclusion that I just need to do one thing *really* well (rather than a bunch of things sort of well). That thing is getting more people started earlier by training them to train themselves and become tech autodidacts as soon as possible. I imagine myself sort of at the starting line pushing as many people as possible into tech fields with a *solid* foundation in Linux terminal skills and coding. I can't run far with them all if I want to be there to boost others. The single most important thing I can do is get as many people as I can started out right because no one else is doing it *at all*. The amount of bad information and content on the Internet is staggering. If I do nothing but fight against that for the rest of my life I'm sure I would still be behind. This simply means I have no time for data structures and algorithms at anything but a *very* fundamental level, a level that remains of practical use while providing a solid foundation for those wishing to dive deeper at a formal university level.

## Saturday, May 16, 2020, 9:11:03AM

I honestly cannot decide what is the best contribution I can make over the next year so I'm reflecting on a few things:

* I'm a good communicator and apparently I motivate people.
* I'm a decent programmer and system administrator, but not spectacular.
* I'm a practical person.
* I'm comparatively slow at everything, but thorough.
* I produce high-quality stuff, when I actually finish.
* I believe in simplicity.
* I feel like the fastest path to productivity is paramount.
* I'm absolutely obsessed with the efficiencies of the terminal.
* I recognized a dire need for *good* content covering terminal and system administration *practical* knowledge, not what it takes to get hired.
* I have no desire to learn data structures and algorithms because they have no *practical* value when it comes to producing value most of the time.
* I'm obsessed with Linux.
* I'm obsessed with shell scripting good-enough tools quickly.
* I'm obsessed with automation.
* I'm obsessed with revolutionizing education and credentialing.

I suppose considering the needs of the world is important as well to see where they align:

* The world is imploding because its systems are broken.
* The world's systems are broken because people are not engaged.
* The world needs to be less distracted.
* The world needs more people producing real value and change.
* The world needs more protectors and engineers.
* The world is at war, a massive cyberwar that goes unseen.

## Wednesday, April 1, 2020, 4:15:31PM

Well doesn't [this](https://pastebin.com/v3K0we9d) smell like onions. ;)

## Monday, March 30, 2020, 12:30:03PM

Have a look at Dave Abrams' talk on POP as it relates to OOP. Also need to look for [OOP operating systems](https://en.wikipedia.org/wiki/Object-oriented_operating_system). Still need to discover if any substantial operating systems have survived that were written in pure OOP. Also some [problems with OOP](http://doc.cat-v.org/programming/bad_properties_of_OO) written in a very formal way.

## Monday, March 30, 2020, 11:25:36AM

Finally! What looks like a half-decent [Golang tutorial](https://www.tutorialspoint.com/go/index.htm). Doesn't look like it covers go mod though (but none do, not even the books).
## Monday, March 30, 2020, 10:25:38AM

Need to look into Olive for video editing on Linux. And apparently Blender is useful for video editing as well these days. [tryhard-mode](rwxrob.mp4) Came across the idea of having terminal/video themes based on the day. Beginner day might be pleasant and with blue while TryHard day might have red or tmatrix or something so that people can distinguish them easily in the VODs. Also need to look into YouTube-DL for downloading videos without needing to use the service. Also works for Twitch.

## Monday, March 30, 2020, 10:16:33AM

While demonstrating how to deploy a site from GitLab to Netlify I realized that the rw tool really needs to have a CI/CD mode so it can be used from Netlify and applied to published Pandoc Markdown so that the HTML is generated on Netlify and not stored in the Git repo itself. That would allow people to keep Markdown notes in GitLab or GitHub using the online editor there without *any* need to run the renderer locally. It comes at the cost, however, of performance and would not be very sustainable for very large knowledge nodes.

## Monday, March 30, 2020, 10:01:54AM

Here's a case study from a group that ultimately realized capturing, managing, sharing, and maintaining knowledge as Markdown source was a [good idea](https://caitiem.com/2020/03/29/design-docs-markdown-and-git/).

How to run a TMUX within a TMUX:

1. start new tmux session
2. split pane
3. unset TMUX in pane 2 (this allows tmux in tmux)
4. start new tmux session in pane
5. repeat 1-3
6. run `tmux attach -t <target-session>` (this opens the shared session)

## Sunday, March 22, 2020, 11:19:58PM

Having fun with people on OverTheWire.org:

    lsof -p PID
    lsof -n | grep pts
    htop                               # discover
    echo -e "\e]11;rgb:ff/00/ff\007"   # color
    echo -e "\e(0"                     # gibberish
    echo -e "\e[?1003h"                # gibberish
    echo -e "\e[<Start>;<End>r"        # scroll region
    reset                              # also reset

## Sunday, March 22, 2020, 4:13:59PM

Blows me away how good training for cybersecurity is at teaching people to think and research, becoming truly powerful autodidacts.

## Sunday, March 22, 2020, 12:29:27PM

Tips for mentoring online:

* No more than 2 hours per stream without a 1 hour break.

## Sunday, March 22, 2020, 10:57:10AM

Alternatives to Discord to look into:

------------- --- --- --- --- ----------------------------
Name          C   S   V   P   Notes
------------- --- --- --- --- ----------------------------
Discord       x   x   x   x   
Zoom          x   x   x   x   
Jabber        x   x   x   o   
Mumble        x   x   o   o   "People have moved on from"
Jitsi         x   x   x   o   
TeamSpeak     x   x   o   x   
MatterMost    x   x   x   o   
Matrix/Riot   x   x   x   o   Apache Licensed
------------- --- --- --- --- ----------------------------

* Mozilla has moved from IRC to Matrix

## Friday, March 20, 2020, 2:39:44PM

TIL that /msg rwxrob /raid lastmiles would raid lastmiles on Twitch after the timeout lapsed.

## Friday, March 20, 2020, 10:06:31AM

After we made the name rwx.gg for the group of us working on capturing the list of requirements for open credentials we had fun with what the gg could stand for:

* good game
* git gud
* grep god
* gnarly gnome
* gnu guru
* go gypsy

"It's like the knowledge equivalent of 'right to repair'?"

## Thursday, March 19, 2020, 6:54:51PM

TIL that https://prerender.io is actually being used by absolutely stupid people who care nothing for blind people and those using text-based browsers. I have never been more motivated to destroy a site than right now. These people have no idea about the amount of damage they are causing the Web and frankly they don't care. This is not some idle rant from an aging dinosaur wanting to use a text-based browser.
This is objectively destroying our ability as humans to search, archive, and access what we have come to love. There has never been a time when the README World Exchange was more needed. Then the world of intelligent humans can be free from the stupidity of those who would destroy the Web's basic purpose for the sake of whatever JavaScript grants them.

## Thursday, March 19, 2020, 4:27:03PM

Just learning about KnowBots today. They are bots that collect knowledge based on specified criteria and have been around a long time. I really want to make know.sh into a service where people can configure their own KnowBots.

## Tuesday, March 17, 2020, 10:48:14PM

PicoCTF has an amazing shell interface that can be used from anywhere there is web access. However, some of the "challenges" require use of the Unity game plugin. I don't have to say how much I disapprove of that. Also the challenges are not in any particular order and have no notion of advancement and building on previous skills (unlike OverTheWire.org). Concluded the interface beyond that shell is just too kindergarten to stand for one more second. They actually have you enter data into their graphic UI, which is required. They have a great shell and then never use it to submit any flags, which all have to be submitted through their graphic user interface (web page) no matter how simple they are. I'm seriously disappointed because they set up the shell so perfectly and then fucked it up by requiring Unity and a React web GUI to even submit stuff.

## Tuesday, March 17, 2020, 5:14:26PM

I'm seriously disturbed by how completely behind most web development is today. The Odin Project doesn't even discuss JAMstack or development focused on APIs including GraphQL. I want to get involved to help them because I love everything else about their project, but I just can't given the amount of time it takes to get involved. I would rather provide input to a big project like that instead of building all my own stuff though.
## Monday, March 16, 2020, 9:44:00PM

Wow the weeks are flying by. So glad I have the number of weeks in the directory name for the blog week. So with the Corona virus shutting all organized gatherings down, like completely, this has been a good chance to test everything over Discord sessions. So far Discord turns out to actually be better than in person. Some people still feel more comfortable learning in person even if they are objectively better learners over Discord. Talk about conditioning. The school system has everyone thinking you have to be at a place to learn. The hard reality is that all of these people are being conditioned to not learn without the crutch that is the system. It's actually hurting them.

## Sunday, March 15, 2020, 1:08:37PM

So streaming has overall been a blast, but there is one serious down-side. A certain minority think they are experts on everything "you" should do. Yet they don't have anything of their own. "You need experience," yet they don't have a single stream or YouTube video to draw from to back up their authoritative bullshit. Thank God for the ban function, which reminds me I need to figure out how to do it from IRC. The only way to deal with these types of entitled assholes is to instantly ban them. I don't care if they are immature gamers who don't know better. They need a consequence. "What the fuck!? How dare you ban me? I'm just trying to help." Go fuck yourself. I have a zero tolerance policy for ingratitude. There are a ton of other things I could do with my free time besides provide $100/hour private consulting content for free on Twitch. I could be making $150/hour as a CTO or tech consultant IRL. So when some pleeb tells me what to do with my stream without even offering a single thank you for the time and money I've sacrificed without any agenda, not even seeking advertising sponsorships, that's the only appropriate response.
I hate to make this conclusion, but I keep running specifically into gamers who fall victim to this phenomenon.

## Saturday, March 14, 2020, 6:10:13PM

Woah, I just found a legitimate use for the style tags from HTML5 that are attributes instead of CSS. The following works in a text browser but does not work with CSS:

    <th align=left colspan=3>Weekdays</th>

## Saturday, March 14, 2020, 5:42:10PM

I'm reminded that in the rw tool I need to be sure to detect a localized template.html file rather than use the site-wide one automatically — especially now that YAML can be combined with partials to provide very powerful static view renderings. This essentially turns every README subdirectory into its own stand-alone page as needed while allowing them to share common resources for maximum sustainability.

## Saturday, March 14, 2020, 1:04:26PM

Been suffering through "NOVICE mode" in Lynx and forgot to change it to "ADVANCED" which clears all the crap at the bottom.

## Saturday, March 14, 2020, 1:01:05PM

God I LOVE the Twitch community! Solved a stupid annoying thing I have been working on for years:

    bind-key -n C-y send-prefix

This allows you to use a different prefix (C-y) for nested sessions (TMUX or screen). Still need to test it though.

## Friday, March 13, 2020, 1:58:12PM

So here's another take on my selection of callables (things that can be called from the command line) based on increasing needs and depth:

1. Alias
2. Exported Bash Function
3. POSIX Shell Script
4. Bash Shell Script
5. Perl
6. (Ruby / Python)
7. Go
8. Rust
9. C
10. Assembly

## Friday, March 13, 2020, 1:11:13PM

Quickly compared links to lynx and the Unicode support in links is still far too sketchy (and yes the annoying exit confirmation is still the default). The old lynx has been updated to be better even though it still has a lot of old bloat for unsupported stuff used in the 70s and 80s. The end. Lynx is your friend.
## Thursday, March 12, 2020, 9:05:23PM

Realized earlier today that I had not blocked any time on the schedule for coding, only the other stuff. That can't be. So I divided up Saturday night into two blocks for Go and Rust development and Sundays just for Shell. The C and Assembly can come during the OSCP time. The Python and Web can be during the G0CT time.

## Thursday, March 12, 2020, 5:39:13PM

I've decided that the CTF games will only be OverTheWire.org until I finish them all since that is where most beginners will start and I want to be able to nudge them as needed at any time. So anytime new OTW content comes out that will take priority. Otherwise HTB Pro will be priority during the three OSCP-focused days of the week.

## Thursday, March 12, 2020, 4:18:31PM

So I have concluded that I do want to make specific times for the following specific topics:

1. RWX Cafe (Random Tech Talk)
2. G0CT Study Group
3. OSCP Study Group
4. Mentors Guild
5. Hacker CTF Games (OTW, HTB, etc.)
6. 1-1 Private Mentoring
7. Bash Scripting
8. Go Programming
9. Rust Programming

Twitch needs to add a Hacker CTF category. What if we had entirely remote CTFs that could be monitored on Twitch, etc.? By the way, I love the word "nudge" from the OffSec world.

## Thursday, March 12, 2020, 4:03:35PM

Should I alternate days for G0CT- and OSCP-focused times? One day would be just for beginners working on G0CT and the other would be only for those more advanced working on the OSCP specifically. This also has me thinking that maybe I need to carve out time to talk with other mentors and educators about helping other people learn. Next time I get to talk to strager or iscreman25 I'll have to ask them. Eventually I need to have the schedule for G0CT very specific. But I really don't need to sweat the specific details because I can just change the current topic and most people will be watching the YouTube videos that result anyway.
There are lots of other questions about making videos, where they should be published and when. My initial feeling on this is to not push anything to Twitch, only YouTube, and to keep Twitch for live streaming. The highlights on Twitch would then just redirect to the stuff on YouTube. YouTube is more sustainable and discoverable for overall content. I'm feeling uncomfortable that it's been almost a week since I have published a YT video, but all this stuff has to be worked out to save time. For example, had I thought through this just a bit better, I might not have wasted time on unnecessary videos.

## Wednesday, March 11, 2020, 8:23:03PM

Tonight I will be going back to two hours of OverTheWire.org since that is the prime time most people in America are online. Then I'll just push back waking up. And in other news, Imma hack Lynx to not waste the bottom line. It's not worth using one of the broken new text browsers instead. (Although I do need to look at the latest links again to see where it's at.)

## Wednesday, March 11, 2020, 7:18:28PM

Just happened on a fun term: "break-in room" instead of "escape room". The latter is well known and popular. But what if we made the opposite and used lock picking and hacking in the physical world? I bet it would do well.

## Wednesday, March 11, 2020, 6:54:30PM

Can you use nothing but the magic wand !! in vi to replace these with their outputs? Can you tell what program to use?

    for i in {1..10};do echo Item.i; done
    [print(f"Item {x}.") for x in range(1,11)]
    (234234 * 3423423) / 234

Can you guess the output of each?

Yeah, you probably don’t need a plugin. Just use your shell. It’s the best vi extension ever made.
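For anyone trying this outside vi: typing `!!sh` on a line filters that line through sh and replaces it with the command's output. A minimal sketch of the same substitution done directly in the shell, using a made-up line of my own (not one of the quiz lines above, so no spoilers):

```shell
# What vi's !!sh does to a line, reproduced in the shell directly:
# the line's text is piped into sh and replaced by the output.
echo 'seq 3 | while read -r i; do echo "Item $i"; done' | sh
# prints:
#   Item 1
#   Item 2
#   Item 3
```

In vi itself you would put the cursor on the line and type `!!sh` (or `!!bash`, `!!python3`, etc., depending on what the line is written in).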

## Wednesday, March 11, 2020, 4:21:26PM

TIL that the lag from not restarting your stream every few hours or so is insane. Might be from the quality of the watcher. Will have to test.

## Tuesday, March 10, 2020, 8:22:34PM

God damn this Go yamllinter is fast.

## Tuesday, March 10, 2020, 2:19:01PM

“Crowd-source Credentials” “Open-sourced Credentials.” “Open Credentials”

Need a YAML/JSON data structure to encapsulate the open credential requirements.

## Monday, March 9, 2020, 7:26:52PM

So looks like https://overthewire.org doesn’t like having spoilers posted but I don’t feel like I’m “posting” spoilers to be picked up and found. Anyone can decide not to watch and since I’ve disabled VODs it’s just me and some friends figuring things out without giving anything away. It’s really no different than someone being with me and just hanging out working on it.

## Sunday, March 8, 2020, 8:15:29PM

Alias -> Exported Function -> POSIX Shell Script -> Bash Shell Script -> Compiled
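The first two steps of that progression sketched out (the `ll` name and its stub body are hypothetical, just for illustration), including the reason exported functions rank above aliases --- they survive into child bash processes:

```shell
#!/bin/bash
# Step 1: an alias -- only visible in the shell that defines it,
# never inherited by scripts or child shells.
alias ll='ls -l'

# Step 2: an exported function -- with export -f (a Bash-only feature)
# it becomes visible to child *bash* processes too.
ll() { echo "would run: ls -l $*"; }   # stub body to keep the demo portable
export -f ll

bash -c 'll /tmp'   # the child bash sees the exported function
# prints: would run: ls -l /tmp
```

Past that point (POSIX script, Bash script, compiled) the callable lives in ~/bin as a real file and works from *any* parent process, not just bash.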

## Sunday, March 8, 2020, 3:10:01PM

Twitch wins again! New friend on twitch just shared this text-based pastebin alternative https://ix.io. I am in fucking terminal heaven over here. I simply cannot put a value on the number of amazing insights I have had over the last month on Twitch.

I am so fucking blown away by how useful this is. You can create a completely anonymous post from the command line by just sending a file to the ix command after creating a simple ~/bin/ix script like this:


    #!/bin/sh
    argsorin "$*" | curl -F 'f:1=<-' ix.io

The argsorin is a useful script I made some time ago to intelligently detect arguments and smoosh them together or read from stdin for here-doc syntax and stuff:

    #!/bin/sh
    argsorin () {
      buf="$*"
      if [ -n "$buf" ]; then
        echo -n "$buf"
        return
      fi
      while IFS= read -r line; do
        buf=$buf$line
      done
      echo "$buf"
    }
    echo "$(argsorin "$@")"

Here's a simple example:

    $ ix echo hello world
    http://ix.io/2dLp
    $ curl ix.io/2dLp
    echo hello world

I usually write this stuff as functions so they can be cut and pasted easily if people want to just incorporate them into their own code rather than invoking a sub-process just for the command.

## Sunday, March 8, 2020, 2:40:42PM

Need to take a look at <https://liveoverflow.com> which [fox_maccloud](https://twitch.tv/fox_maccloud) is recommending for picking up memory buffer overflow vulns. Also need to get a resources wiki-ish page ready to hold all these references.

## Sunday, March 8, 2020, 1:18:05PM

TIL you can add -x to bash to cause it to output the content of the script as it is being executed. No more multiple echo statements. Huzzah!

## Sunday, March 8, 2020, 1:13:46PM

> Zsh, or Z shell, was first released by Paul Falstad back in 1990 *when
> he was still a student* at Princeton University.

I see that continuing to be a problem. People make things during college that should never have existed and somehow people adopt them as standards. The simple vimtutor (which is totally broken) is just one such example.

## Sunday, March 8, 2020, 1:11:20PM

There's something miraculous about purging the angst of hitting horrible technology (that I cannot legally hack despite my initial impulse). [Shameful Shitty Sites](/shameful-shitty-sites/) is one way to deal with it. Who knows, maybe it will pull up in an Internet search one day.

## Sunday, March 8, 2020, 12:12:02PM

Still feeling the pain of that major burn setting up integration of Bash command functions with applications that must call them from the exec() system call (rendering them useless). I still feel they are far more useful than aliases and should be used for anything that will be called from the shell. In short, the best approach is to simply think of them as shell extensions, period.
Going too far and thinking of them as commands will eventually trip you up (as it did me with the TMUX status). The interesting question is then what language --- specifically which shell version --- to write in when creating actual command *scripts*. To me the answer is pretty obvious: POSIX shell. These days the closest shell to POSIX is dash, without question. It is behind /bin/sh on all Debian systems and is *wicked* fast. Since dash is already loaded into memory (like bash) there is a good chance the interpreter loading time is saved as well (which is *not* the case with Python or Perl). So the moral of this sad tale is keep your functions as extensions of your shell and write anything else as dash scripts in ~/bin. In doing so we keep everyone happy because /bin/sh might be pretty much anything on a given system, and because your command scripts are all POSIX there is never any fear of them not running on something. Unfortunately this means that learning the shitty sub-process-intensive approaches is still required to maintain POSIX compliance. (Apparently the world isn't ready for extreme efficiency like that.) If you *do* want it, and don't particularly care about portability, just write it in Bash and be happy, but not until you *need* a Bash-ism like good regular expressions or avoiding a subproc monstrosity. After all, POSIX was made for a *completely* different world with different constraints. We don't use many dynamically linked libraries now, for example. Everything seems to be into static linking, and usually for good reason. POSIX isn't dead, but it sure isn't required like it was anymore either. As for me? Everything starts as a command function until I need it from another program (exec() syscall). Then it gets ported to a /bin/sh POSIX script. Then if I even have to make one subproc I change /bin/sh to /bin/bash instead and live happily ever after. The end.
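A minimal sketch of what "stay POSIX until you need a Bash-ism" looks like in practice (the `is_num` helper is hypothetical): `case` glob patterns work under any /bin/sh including dash, whereas reaching for `[[ "$1" =~ ^[0-9]+$ ]]` would force the shebang over to /bin/bash:

```shell
#!/bin/sh
# POSIX-only: case glob matching instead of bash's [[ str =~ regex ]]
is_num() {
  case "$1" in
    ''|*[!0-9]*) return 1 ;;   # empty, or contains a non-digit
    *)           return 0 ;;
  esac
}

is_num 42 && echo "42 is a number"
is_num 4x || echo "4x is not"
```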
In most cases someone coming across /bin/bash who is using Ksh or Zsh *should* just be able to change the name. But that's no longer my problem.

## Sunday, March 8, 2020, 7:37:08AM

While working on my TMUX status line and realizing that my exported bash command functions were not usable from the #(status) syntax, I did some digging and found [this information](https://stackoverflow.com/questions/35522187/how-to-export-a-function-from-tmux-conf). This really had me challenging this new way of managing what I had previously managed as a collection of stuff in ~/bin. In fact, it has caused me to seriously consider when to even use them. Sometime in November I started to rewrite my Bash configuration scripts. I noted the speed of exported command functions to be ridiculously faster than invoking shell scripts instead. So being the freak I am when it comes to efficiency I ported most of my utility commands into command functions instead, available to be run at any moment using the export -f myfunc syntax, which is *not* available in anything else. Not only are command functions much faster than actual command scripts, they are far easier to maintain since they all go in some version of a bashrc file which can be easily distributed. I have also just read all about "ShellShock" (also "Bashdoor") which happened in 2014. I'm completely and totally blown away by the scope of that vulnerability and how I have inadvertently fallen directly into a horrible trap. I now have to help others avoid it and responsibly describe how I have made a very understandable mistake in using command functions *at all* (despite all the shit I say about zsh not supporting them). There's actually a *very* good reason not to (or at least there was). By adding on to the end of the exported environment variable containing the command function, hackers created the biggest shell exploit in history. The fact that this bug went for so long without being caught is unfathomable.
Frankly it is a huge embarrassment for the Bash team. Everything has been patched, but no wonder so many Linux people have avoided them like the plague despite their utility. Does this mean I will be converting to zsh? No way. But it does mean that creating a ~/bin directory (as I have done for more than two decades) is *still* the best way to manage all your utilities. The single best criterion as to whether something should be a Bash command function or an actual command script is whether or not it will ever be used by anything but bash. So, for example, creating command functions for stuff that is frequently paired with !! in vi is fine since that always shells out to Bash, but if any of those functions will be used from anything *but* Bash they might not be so useful. This has me still horribly conflicted between the modularity and convention of the ~/bin approach versus the decidedly Bash-*only* approach of using command functions.

## Saturday, March 7, 2020, 2:54:43PM

So it looks like there is color emoji support in Alacritty in the latest version by providing a font "fallback" for them (instead of the black and white versions). Very helpful people on the IRC channel.

## Saturday, March 7, 2020, 1:34:04PM

Been reminded recently (after talking about Rob Pike and his famous "ed is the only editor you need" statement) of the importance of learning a command-based editor like ed. It is often the *only* editor you can use when you don't have visual mode after hacking into a system through a service that does not support a full TTY. I remember doing that with Metasploitable some years ago for our Summer camp and being like, "Hummm, so there actually is very real value in learning ed." Now I'm trying to determine if ex (the successor to ed) is just as good. I notice that it is now combined with the man page for vi so not sure. I want to create a simulation of a situation where ed is all you get and how to do that.
I think something as simple as a basic script tied to nc should do the trick, sort of a local, hackable service just to simulate the lack of a full terminal environment.

In fact, a friend from the Twitch stream shared the fun fact that Joy gave up on vi because there were so many different versions and just used ed instead. vi didn't even start in visual mode by default. He added that a couple of months later after noticing that nobody was using it as a line editor. Apparently there's a whole [video](http://xahlee.info/comp/interview_with_bill_joy.html) and [another article](https://www.theregister.co.uk/Print/2003/09/11/bill_joys_greatest_gift/) on the history of it all. (God I love having a stream community.)

## Saturday, March 7, 2020, 1:29:18PM

Out of curiosity I actually looked up how sed suddenly had in-place editing. Turns out it has been in the GNU version the whole time, which is why I always used perl -p -i -e instead because I knew it would be on *every* UNIX system, not just GNU stuff. But given the widespread use of Linux now I suppose using sed -i is good to know. It's not unlike mkdir -p /some/long/path which was unsupported in the original mkdir versions (another thing GNU added). I use it all the time now, but knowing when and how stuff changed is something of an obsession.

## Saturday, March 7, 2020, 12:41:19PM

Gabe brought in his new Razer Blade 15" 2019 and is reporting on it. I'm completely convinced it's the go-to system for most pentesting and forensics (not to mention the best gaming laptop). He had problems initially with suspension on screen closing (because the default Mint kernel on the 19.3 ISO was too old); upgrading the kernel fixed it. The chassis is the most solid I've encountered and I *love* the keyboard, sturdy and chicklet-y. I *destroy* keyboards so that is a bare requirement. Definitely my next purchase when ready to go mobile again for everything. My (bork) XPS will have to do for now.
Gabe says make sure to get it from Amazon since the price and warranty are better. He bought the four year coverage including accidents.

## Saturday, March 7, 2020, 12:26:40PM

GitLab *sucks* when viewed with Lynx. Need to open a ticket.

## Saturday, March 7, 2020, 12:23:12PM

TIL that screen now has splitting panes. The question came from a long-time screen user about why he should consider TMUX. A bunch of things came to mind, but these are best summarized [here](https://superuser.com/a/279167). The author left off the customization of vim bindings for navigation between panes and resizing them, which I could not live without. I did find it extremely validating that the screen default bindings for splitting use the same exact keys I have set in my [tmux.conf](https://gitlab.com/rwxrob/config/-/blob/master/tmux.conf) file (rather than the bork defaults).

## Saturday, March 7, 2020, 11:13:57AM

Back on standard Xterm colors. There is something about it that is beyond "retro" feel. The colors are very solid. The contrasts are amazing on any device at any size and dimensions. And people see what they will likely see when they pull up a terminal for the first time on a Raspberry Pi or whatever. Plus asciiquarium looks as expected.

## Saturday, March 7, 2020, 10:28:07AM

Last night someone showed us a video about a seriously mentally ill genius who created their own compiler and operating system and everything that goes with it including some really crazy graphics. At first it was intriguing but by the end I was sad and pissed for having seen it. I know the person mentioning it wanted to explore the possibility of making an OS and was pointing out the extreme intelligence involved, but the video seemed to linger and dwell on this person's descent into madness for no other purpose than to sensationalize the drama. I suppose capturing the story is one thing, but showing all the videos of his worst moments crossed the line. I didn't need that.
It triggered my disgust with people who laugh at or just overly sensationalize any mental illness. I think I might have hurt some feelings without meaning to, but it's not okay. Mr. Robot deals with that topic so perfectly.

## Saturday, March 7, 2020, 1:04:02AM

So today's OTW had a lot of SSL port stuff in it that was nice to see because the last time I did anything with port connections and enumeration was in the 90s (other than for fun here). The openssl s_client subcommand is absolutely amazing. It takes all the hassle out of the SSL/TLS complexity when connecting to ports, allowing connections to SSL enabled ports without having to negotiate all the cert shit. Here's one to remember:

    openssl s_client -ign_eof -connect <host>:<port>

It is the same as plain old nc of old. It waits around to see what you type in and otherwise passes on whatever you paste in. The -ign_eof holds the connection open for interactive sessions. Looks like you need this and cannot just pipe to stdin as you can with nc, but I'll research that more later.

## Friday, March 6, 2020, 10:13:32PM

Turns out the sound corruption is from OBS Studio. The fix is to exit OBS completely along with the app using sound (usually the web browser) and then restart the app and restart OBS.

## Friday, March 6, 2020, 7:56:55PM

With all the focus on pentesting it has really warped my sense of software development. For example, learning everything there is to learn about CSS is simply not needed. Creating the G0CT has been one of the best things for SkilStak because it has really forced me to focus on only those things that have very broad application to all foundational technology careers. For example, there is no need to learn Vue or React, only vanilla JavaScript with a solid understanding of the DOM and a focus on HTTP (along with other networking). This is coming up now because even though I stand by the recommendation to get Jennifer's Learning Web Design book, most of it is irrelevant to the G0CT.
It has such solid information in it that everyone should eventually work through the whole thing. But I plan on getting very specific about the sections and projects that are required just to get a good initial understanding, leading to the certificate-evaluated project of creating a personal/professional web site from scratch with the minimal requirements:

• Responsive
• Semantic (justify every div)
• Pandoc Markdown Generated

The title of the sub-cert, Knowledge Content Creation, also conveys its goal better. Sure, understanding cross-site scripting vulnerabilities will require a lot more JavaScript understanding later, but not to get started. In short the G0CT should equally provide a foundation for all of the following fast-growing career paths:

• Full-Stack PWA Developer
• Senior Software Engineer
• Systems Applications Developer
• Site Reliability Engineer
• Machine Learning Engineer
• Embedded Systems Engineer
• Offensive Security Professional

As I go through the vetting process for all the content I already have and decide what stays and what goes (or gets put into another category or cert) I'll be applying this checklist to make sure whatever is there applies to everything. Earlier I was struggling a little because I had "Web Developer" in the list, which does not require Linux and command line skills beyond basic navigation. But a Full-Stack PWA Developer needs much, much more — including a full understanding of Bash scripting, HTTP, and the use of curl.

## Wednesday, March 4, 2020, 4:46:42PM

Rather conflicted at the moment about which interpreted language to pick back up again. All of the following are supported by weechat and are prevalent in the Metasploit database:

• Python
• Perl
• TCL
• Ruby
• Lua

They are all mandatory languages to learn at some point. The question is which do you choose when all are available. I did find it interesting that Node is not among them.
I think, as I concluded earlier, that I have to get over my "bad breakup" (Bryan Cantrill's term) with Python and just accept that it is a required secondary language to maintain. Here's all the justification for that:

• Massive user base and only getting larger.
• Machine learning (numpy, TensorFlow, etc).
• Python will always be the darling of scientists.
• Python is in everything already.
• Pentesting uses a lot of Python.
• Python Fabric has no substitute.
• Python data structures seriously dominate.
• Python C stubbing is best in class (for interpreted languages).

Have to look at Nim as well. But as a general purpose language Python is mandatory learning for the G0CT certification, there is no doubt of that (as much as I might hate that to be true).

## Wednesday, March 4, 2020, 12:57:52PM

Discovered today that the pretty graph that shows the volume of a sound stream is called a VU meter. Turns out there is a really good one for the terminal, which means, of course, that I had to get it. God I love what is possible with full color terminal support all over the place now.

## Tuesday, March 3, 2020, 8:55:32PM

For the first time I actually miss Arch Linux. It definitely has more up-to-date packages, and shit I want to use (weechat) is waaaay behind on the standard Mint distro, which is to be expected but still annoying. This means I have to make a decision: leave Mint and go down the Arch rabbit hole just for my own stuff while maintaining a healthy relationship with Mint so that I can help beginners get started on it, or keep Mint and suffer through extremely old stuff. There is no doubt that the core Debian team is on top of things, but the Arch team is where all the edge energy is from the Linux development community. That's a given. I'm torn because I want to join that community (again) and to be there you really just have to have the distro they have decided to use, which seems to be Arch at the moment.
I'm inclined to put Arch on my main system, but doing that would mean that anyone on the stream who is just beginning might be misled to go with Arch before they are ready for it. I had considered just using a VM or streaming a Kali system next to me (which I'll have to do anyway for OSCP prep). It is Debian after all. But that has all the stuff I need on it already and I would not be showing how to do installs much. I can give this some time and see how much Kali I'll actually be using all the time.

I could make Arch my main distro but usually have it set to Kali for most work for the OSCP prep stuff and even treat Kali like more than just Kali. I wonder how well a Kali Linux system would stream. If it streamed well I could set that as the main system since almost all my work will be using tools that are on it and showing what those tools are. It makes sense to consider how easy it would be to set up a stream from it. It's likely that the problem with outdated packages and not being on the edge of Linux development will also affect Kali, possibly more in fact. I have some time to think this through. I could also use this capture card and a projector to have more than one computer associated with a single screen, essentially, and switch back and forth between them just with OBS.

## Tuesday, March 3, 2020, 6:14:41PM

WeeChat has definitely sent me down a great rabbit hole. In fact, I no longer find even the slightest interest in participating in any chat that is not IRC. I completely understand why the "1337" hackers only use IRC; it is so ridiculously more productive than any graphical chat application, that much is clear after only about an hour playing with weechat. I'm so glad Twitch has built everything on it. I could not be more pleased with their decision. Time to go find some more Freenode channels.

## Tuesday, March 3, 2020, 5:23:29PM

I absolutely love that weechat is entirely written in C.
I've decided that is the first thing I'm going to learn in IRC land. It supports Python and Perl plugins, and frankly to become a seriously good pentester everyone should learn Python, Perl, Ruby, and TCL as well as Bash, even if Bash can do everything all the others can. People need to at least be able to read the others since so much in the Metasploit database is written in one of those languages, as are plugins for different hacker-y things like weechat, which is really the new BitchX.

## Tuesday, March 3, 2020, 5:09:56PM

Python is officially back in the house. Python most perfectly fits that place of making significant software that is not yet worthy of a full port to a statically compiled language. It is the best for data analysis, and its dominance of systems automation with Fabric means that it will retain its position as a systems glue language despite Go's supremacy. It is a very small niche, to be sure, but enough to justify learning it for any G0CT. The dilemma I have is whether to bring Perl back in addition to Python. It is still a very powerful language, far more powerful than Python specifically for pentesting. Things can be done so much more quickly in Perl.

## Monday, March 2, 2020, 8:39:45PM

After hacking my way into https://www.hackthebox.com (don't forget the www) and watching the earlier OSCP guy's video it has become clear that there is another mid level between G0CT and OSCP that involves everything needed to even create an account on hackthebox. It requires significant JavaScript knowledge as well as HTML and HTTP and base64 encoding. Those are not beginner things in the slightest.

## Monday, March 2, 2020, 5:34:51PM

This guy says that his "report" was 240 pages of markdown and screenshots converted to PDF. So yeah, knowledge source management is a critical dependency skill in all of this. Love that this validates including it in G0CT as well as everything with the README World Exchange.
Good advice on the "memory buffer overflow" machine: "Setup your payload and shell code so you can go back to it, not you … That way there's no need for them to regenerate shell code. There's no need for them to create their rhosts and rport. Make it one-way so you can callback to it, not to you." The callback is the thing running that grants access. He's saying to put all that code on the remote machine so that the overflow code calls that, and not something that initiates a call for that code from your local system. That's pretty obvious stuff because that is exactly what a backdoor is: opening something you can log in to on the remote target.

Reading up on it, the term "enumeration", which has a rather specific meaning in the software world, here just means capturing your discovery results as the initial phase.

::: Enumeration is used to gather the below

* Usernames, Group names
* Hostnames
* Network shares and services
* IP tables and routing tables
* Service settings and Audit configurations
* Application and banners
* SNMP and DNS Details

Enumeration can be performed on the below.

1. NetBios Enumeration
2. SNMP Enumeration
3. LDAP Enumeration
4. NTP Enumeration
5. SMTP Enumeration
6. DNS Enumeration
7. Windows Enumeration
8. UNIX/Linux Enumeration

:::

Enumeration is more than nmapping; it is "enumerating" all the information available from a given port. So SNMP has a lot of information about the system (snmpwalk). The fact that "enumeration" is considered a basic foundational skill for OSCP implies there is a lot of base learning required far beyond what I was planning for G0CT. I happen to know a lot of it, but someone who hasn't been a system admin or done any patch auditing is going to really feel left in the dark on that stuff. Something like G0CP -> ???? -> OSCP that would contain all the material from lpic-0, klcp, lpic-1, network+, a+, security+, but just not suck so badly.

Apparently this guy "took a lot of breaks" which is going to be really important for me.
At least the test is taken from the safest place you can find and not their testing center. Instead of "try harder" he suggests: "Take a break and try your process again." I really love that. Trying harder means you'll just produce cortisol and it will block you from seeing what will be obvious to someone else, or to yourself after you've taken a good enough break. I love that he didn't use Metasploit at all. I fucking hate it. It's a script-kiddy thing. Also he says don't waste time on kernel exploits even though the kernel versions are old, because they are not intended to be the main vectors. He finished in eight hours, but it sounds like he was really prepared.

## Monday, March 2, 2020, 5:21:23PM

So the first thing I learned about the OSCP process is that it takes up to a month just to get your packet and lab access after your application, and another full month before your request to take the actual test goes through. That means there is at least two months of lag time for those who might not even need that long to do everything (which I definitely will). This means that there is no way I could reasonably hit the OSCP by DefCon; instead DefCon will be an amazing way to tap the minds of a bunch of people who have received theirs and potentially record them to share with others. If anything DefCon is sort of a mandatory expense (to me) for covering as much as I can during the process and documenting it for everyone. I also need to spend most of my free video watching time watching all the others who have received their OSCP and start cataloging them.

## Monday, March 2, 2020, 5:11:36PM

Can I just say how fucking happy I am to be free of concern for teaching and keeping up on web shit. The only thing I ever want to learn about it is how to hack it. It has created such a level of stress for so many years. It is a good skill and creating a JAMstack site is required learning, but all the rest? Fuck no. I'm done. It's so completely liberating.
Mostly I'm just happy to be rid of the entire web development community — specifically the JavaScript community — which is filled with horrendously horrible human beings.

## Monday, March 2, 2020, 4:51:42PM

Perhaps the coolest thing about the OSCP is that it fundamentally forces you to build your continuous learning skills and creativity. This is exactly what terrifies people, because they haven't been taught to do research and learn on their own. They've had their creativity beaten out of them by the school system. The OSCP demands high technical skills, but more than anything it requires solid research and self-evaluation skills.

## Monday, March 2, 2020, 4:32:28PM

Found a great OSCP resource that I think does a good job covering what people should expect. It was interesting that Assembly and memory buffer overflows are a part of the OSCP. That definitely puts the OSCP in the category of not-for-noobs. In fact, unless you have done some C coding I have a hard time recommending it to anyone. I found the resource while looking for the required languages for the exam. Obviously having a really solid handle on shell scripting is required (which I have), but specifically I'm wondering if they have a Python bias anywhere in their requirements.

One thing mentioned is the amount of research required to complete the 55 labs of different difficulties. It made me smile that I can research faster than 99.9% of the rest of the population with Lynx and the techniques I've developed over the last 20 years. Hacking is really about knowledge more than skill. It confirms my selection of content for the G0CT I'm creating. Reading the overview of OSCP prep is also good to make sure I have all the building blocks leading up to any tech career, but specifically for someone headed to the OSCP. Linux, for example, is not fundamentally required to become a web developer. But it is for pentesting for sure.

## Monday, March 2, 2020, 4:15:09PM

I'm so glad that my subscribers went down by two.
I know that is a weird thing to say, but the reason is that it means people are self-filtering, which is great. Here are the people who leave:

• Those who think setting up a ton of desktop-specific hotkeys is actually a good idea.
• Those who think game development is the only thing that matters.
• Those who think for some reason that the old guy with the beard has nothing important to say.
• Those who are offended by my opinions but unwilling or unable to challenge me on them even though I'm very open to new ideas.
• Those who have no fucking problem sucking every second of my time online without offering anything in return to those in the room.

There. That's all I needed to get off my chest. Now on to stuff that matters.

## Monday, March 2, 2020, 11:18:03AM

Did about an hour of research on SourceHut.org and ultimately concluded the following:

• Obsessed with FSF in the best way.
• Great principal engineers with C and Go in their main profiles.
• 64% Python Flask implementation, 3% Go.
• REST API with no public plans for GraphQL.
• No Twitter account.
• Highly transparent business with financials etc.
• SSH into containers used in CI.
• Mercurial support.
• Minimal search ability.

## Sunday, March 1, 2020, 2:30:46PM

The key to making this work is not going too deep into any of the things covered:

• Go up to and including concurrency and modules but not further. Capstone is a multi-service virtual agent tied to Twitch, Discord, Twitter, and GitLab.
• Bash capstone will be a fully realized bashrc in a config repo containing prominent examples of everything Bash 4+ has to offer including stuff using curl, jq, and no sed, awk, tr, or any other ancient shell sub-process shit.

One thing that will be rather sensitive to deal with is that the G0CT will take the Mr. Miyagi approach. In blunt terms: my way or nothing.
That does not mean I'm not open to challenge and progress and change, only that I will require everyone to master the way I outline first and then decide how they would like to modify it. In other words, no emacs, no zsh, no Arch, my tmux.conf, and so on. This will be a bit difficult and will no doubt cause a lot of controversy but frankly I don't give a fuck. You want my cert, you do it my way. The end. :) Since everything is Creative Commons people will be able to essentially fork my cert material and make their own based on it. But I will trademark the G0CT so that people do not misrepresent this stuff.

## Sunday, March 1, 2020, 1:17:38PM

I'm focusing on the following areas at SkilStak:

1. Ground-Zero Certified Technologist (G0CT)
2. Offensive Security Professional (OSCP, OSWE, OSCE, OSWP, OSEE)

That's it. Anything outside of those is not something I will spend any time on. I will assist people during their sessions with their own projects outside of this scope as much as possible. But I (and SkilStak) will not spend one second focused on producing or researching content for anything outside of this scope. This focus should rocket-propel the production of good and current content over the next year, hopefully before DefCon in August.

The question becomes, "Got your G0CT yet?" If the answer is no then you really have no business going further for any of the Offensive Security stuff. But once you do have a G0CT you can charge down the Offensive Security specialization path (or any other of your choosing outside of SkilStak). The next question is, "Do you have a degree?" If not, then the WGU Bachelor's in Cybersecurity and Information Assurance is my recommendation and what I'll cover (even though I won't be getting one myself). After you complete WGU (or if you already have any degree) go straight for the Offensive Security certifications. This is good to be coming up now because I have been redoing skilstak.io and will change it to be on this focus.
Ironically I've come full circle back to a "stack of skills" with a specific purpose: to provide a solid foundation on which to build everything else.

## Sunday, March 1, 2020, 12:41:41PM

Loving that the newest iteration of content is focused entirely on personal empowerment that applies to anyone, not just people eventually going for tech occupations. For example, Knowledge Content Creator (KNOW) is not web development but includes simple web development as a part of the self-publishing process. Adding Self-Orienting Continuous Learner (SELF) is also a huge win. I finally have my finger on that sort of ephemeral thing that so many take away from SkilStak without formal training in it, although to be fair many come to me having already learned most of it. It all just feels really good, really right.

## Sunday, March 1, 2020, 12:04:16PM

One thing is for sure, all these years of working with rather young people (and older people who declare themselves "non-technical") has really been an advantage. I sort of intuitively find myself explaining everything more like I would to one of these people instead of assuming a lot of technical expertise (like so many other resources and teachers seem to do). I've got to be sure not to lose that as the level of base tech skills increases in the average person I might be helping.

## Sunday, March 1, 2020, 11:01:50AM

Working on a separation between the main skills and ability categories and struggling a bit with where something like ProtonMail would fall. It doesn't go under Modern Computer User, hummm.

## Sunday, March 1, 2020, 8:56:13AM

It's become clear what comes next for SkilStak. Providing my own certifications has always been something I wanted to do and the clearest way to do that is now materializing. I have set up certification-based learning with merit-badge-like requirements before but was overwhelmed with all the content required.
Now that I'm laser focused on the two certificates I can use certification to create a crystal clear path to learning the skills I've learned are most critical and yet have no formal method of learning or validating:

-----  -------------------------------------
INIT   Technology Initiate
USER   Modern Computer User
SELF   Self-Orienting Continuous Learner
NOTE   Knowledge Content Creator
TERM   Linux Bash Terminal Master
SUDO   Multiplatform Desktop Administrator
HOME   Home Network Administrator
PROG   Pragmatic Golang Programmer
PRIV   Privacy Advocate
KNOW   Prescient Technology Professional
-----  -------------------------------------

After getting those there will be a capstone (like an Eagle project), a paper, a thorough personal portfolio review, and a final one-hour board of review before three professionals from the fields of software development, operations, and security.

I have been breaking this material up and learning it in stages with specific goals tied to one of the following projects that people who come here in person regularly want to do:

• Web page
• Web app
• Game
• Job
• Robotic arm
• Minecraft server
• Hacker skills

That pretty much covers all the stuff most people walking through the door want to learn, but they are a different group of, um, "customers" I suppose. Streaming has broadened the number of people interested and provided an opportunity to hone in more specifically on expertise in a few core areas. In other words, these things will still be possible, but won't be stuff I specifically focus on.

## Saturday, February 29, 2020, 7:11:03PM

The nature of learning is something so many don't learn.

## Saturday, February 29, 2020, 5:22:24PM

I'm so jazzed about all this direction now, not just because I want to do it myself but because I have never seen a cleaner path to a tech career that also has immense value for the state of our world today. My new goal is to get as many of my mentored community the certs outlined in the previous post as fast as possible.
I've been ambivalent toward the entire question of getting certs in the past because of their dubious value in capturing the actual skill set you have. But the Offensive Security certs are 100% hands-on. They are exactly what we need to see more of in the world. Besides, the entire pentesting occupation is fundamentally tied to certs for most professional work, if for no other reason than to prove your hacker skills are not only legitimate but that you are not (to use their imaginations) some black-hat in a hoody. Cert city here we come.

This also means that I am going to DefCon no matter what it takes every single year and that the entire focus of both my rwxrob.live stream and the SkilStak IRL community is 100% on foundational tech skills and building pentesting skills on top of those. These are the categories of learning I'm focused on:

• Continuous Research (Be Prescient, Become Your Own Guru) (PTP)
• Modern Technology Foundations (MTF)
• Offensive Security Professional Certification (LPIC0, KLCP, OSCP, etc.)

In fact, Imma formalize my PTP and MTF programs into a hands-on VPN lab certification as well. I've been meaning to provide my own certification and that meshes well with the rest. Now that game development and web applications development are chopped I can focus on formalizing those certs from SkilStak. The rest are all covered by other people. I really like the sound of all that.

The PTP will be the first cert I have ever heard of that tests your certified skills in keeping up, doing your own research, practicing self-evaluation and assessment, and remaining prescient at all times. I need to cut some fat from MTF as well, presumably stuff that is already covered by the certs.

## Saturday, February 29, 2020, 3:47:10PM

Just found out that all of the Offensive Security certifications do not expire.
Nothing says quality and trustworthy like that — especially when compared to the shitty certs from Pearson that are multiple-choice questions and have to be renewed every three years. That is really fucked up. Then again, the LPI certificates are pretty trustworthy and they require renewal every five years. Still, the certs are a solid path to employment that has been proven over and over again in the industry, faster than degrees and cheaper by far. Looks like I'm going to work toward (and suggest others get) the following, probably in this order.

Fastest possible:

-----  ----------------  ------------------------------
$120   Linux Essentials  Basic Linux mastery
$450   KLCP              Complements Linux Essentials
       OSCP              Mother of all RedTeam certs.
-----  ----------------  ------------------------------

Blue team and system administration related path after OSCP:

## Friday, February 21, 2020, 8:51:28AM

Looks like all the reasons to have any presence (code) on GitHub have dissolved. In fact, there are very few things that GitHub has that GitLab does not:

• A lot of members.
• Easier (albeit ancient) web hosting.
• Stronger likelihood of being “discovered”.

No seriously, that is it. And honestly those reasons don’t hold water any more.

## Thursday, February 20, 2020, 9:52:56AM

I've noticed that being subscribed to a mailing list and being required to delete each email as it comes in actually has value because it gives an immediate and ongoing sense of how much the topic is being discussed, how hot it is, for lack of a better word. I've noticed that when the volume of a certain list goes down, say the Sites in Search Working Group, it indirectly indicates there's a problem. When there's a spike in email from one of these lists it is a good idea to have a look at what's going on, because it's probably significant and probably hasn't hit social media yet. In fact, many of these topics never hit social media at all but are critical to understand.

This morning was the first morning that I actually got to review Twitter as I usually do. It's important that I make time for this because of all the critical data that comes through it. People discount Twitter all the time, but it is an invaluable resource of news and critical current information about tech in general and tech as it relates to my specific interests. It's a hard sell to prove to most people. They think Twitter and they think Kardashians and trolls.

## Wednesday, February 19, 2020, 8:35:39PM

Imma make this blog work with just Bash and Pandoc for now. I'll come back and do it right in Go later. After all, Bash is amazing for prototyping and gluing things together (just like Perl used to be famous for).
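A minimal sketch of what that Bash-plus-Pandoc build could look like, assuming entries live in a md/ directory and render to public/ (both directory names are hypothetical, not the real layout):

```shell
#!/bin/bash
# Rebuild the blog: one standalone HTML page per Markdown entry.
# md/ and public/ are placeholder names for this sketch.
set -e
mkdir -p public
for f in md/*.md; do
  [ -e "$f" ] || continue                  # no entries yet, nothing to do
  name=$(basename "$f" .md)
  pandoc --standalone -o "public/$name.html" "$f"
done
```

Dropping that in a Makefile or a single `build` function in bashrc would keep it in the spirit of the command-function approach until the Go rewrite happens.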

## Wednesday, February 19, 2020, 7:13:00PM

There's something interesting happening with Arch, however. I keep saying that you won't find it in the enterprise, but the enterprise is changing. Work is decidedly moving to remote models, models where the worker is responsible for their own IT, their own computer, their own security, and their own operating system. Could it be that saying Arch isn't ready because it isn't "enterprise ready" is looking at the wrong thing?

I wonder if GitLab allows people to use Arch and connect to their VPN to get work done. If so, that’s all we need to know. In other words, Arch could potentially already be “enterprise” simply because the enterprise is composed of people working remotely using whatever the fuck they want. Whatever gets the job done the fastest.

I suppose this is particularly true for cybersecurity — especially bug bounty hunting. No one cares what Linux you’re using if you can break shit, and find broken shit.

Suffice it to say, this has me questioning my position on Arch, but as long as the core Linux certifications and cybersecurity require core knowledge of the less sexy distros I’ll stick with having beginners start with Mint. Mint is 100% stable. I’ve never had a problem with it. Arch is not for beginners no matter how much they feel like they are ready for it.

People who install Arch have the biggest backfire effect response of anyone I talk to. They refuse to accept that Arch is a really bad operating system for most everything in the professional world. I get not wanting to use the equivalent of AIX on your personal system. But the amount that people are completely unwilling to accept that putting a more mainstream OS on their system is fundamentally better for them and their learning is frankly hilarious. There’s simply no convincing them, until they come to the realization on their own when they ask, “Mr. Rob, so how can I get a job?” or “How can I get certified the fastest?” Arch is currently not in any enterprise IT shop on the planet and I’ll pay $50 to the first person who finds one. It is certainly poised to make the jump for many things, but has not yet. It isn’t included in any exam and people continually think that Arch is similar enough to Debian-based systems to have their learning transfer. It doesn’t, period. But unfortunately so many are so set on getting what they falsely perceive as the “cool” Linux that they shoot themselves in the foot and won’t listen to reason — especially from some old, grey-bearded Mint user. I’ll enjoy the company of some of the world’s most intelligent technologists knowing they understand even if no one else does.

## Wednesday, February 19, 2020, 6:36:24PM

Need to remember to bump up the cam resolution to 1920x1080. Forgot that this last time around and the difference is pretty significant, particularly for things like smoothing and saturation.

## Wednesday, February 19, 2020, 3:57:16PM

Need to convert the color changes I made in my personal vimrc for Pandoc into my vim-pandoc-syntax-simple plugin.

## Wednesday, February 19, 2020, 9:10:59AM

A few small things to remember about streaming:

• When setting up video toggle hotkeys don’t turn them on through OBS. Instead always use the hotkey so the toggle doesn’t get out of sync between scenes.
• Decided to change everything about the stream between streams and make the notification the same as the title. That way no one wanders in thinking we’ll be coding when we are just chatting. I wanna be respectful of people’s time.
• It looks like changing everything from IRC is possible, just not from HexChat. Need to look into that.

## Tuesday, February 18, 2020, 11:02:19PM

Rairden showed us a great regular expression search plugin for Chromium browsers. Also https://regexr.com/ for testing regular expressions.

## Tuesday, February 18, 2020, 9:10:03PM

Every time I encounter a shitty site that requires JavaScript — especially those that are documentation like Go tutorials or IRC support — I cannot contain my rage. They trigger me as much as stepping in a full pile of shit IRL. So. The remedy will be this Shameful Shitty Sites page. Perhaps someone might even see one on one of my videos while I mock them to death for how horribly moronic the devs are. It’s one thing to not understand what you are doing (just one pile of shit) but it’s quite another to understand and do it anyway. Those people deserve all five piles of shit bagged up and left on their doorstep on fire. It might not ever be seen by anyone, but it at least lets me get it out of my system and move on. Ah, I feel better getting that shit out already. In fact, I plan on having a weekly segment called Shameful Shitty Sites where I throw any possible monetization out the window and go full Rob Sterling on their asses calling them out by name if I have to. Someone has to fight against this tidal wave of absolute idiocy. It’s actually hurting people’s lives and they either don’t know or knowingly don’t give a shit. (And yeah, I’m definitely adding the mature rating to my stream.)

## Tuesday, February 18, 2020, 7:57:57PM

Need to wrap up that vim-pandoc-syntax-simple plugin because people are looking for a good way to “color” their Markdown.
Should finish that before doing the video on creating a full static site generator in 20 lines of Bash with Pandoc.

## Tuesday, February 18, 2020, 6:33:46PM

Watching the very personable @Adam13531 do VSCode programming on a Discord bot and have a bunch of things going through my head:

• Need to be really ready for absolute novices coming to the stream eventually.
• Must have my own IRC agent completely finished before ever reaching that level of activity.
• No streams on popular topics until all of that is complete. That includes nothing about Discord bots, PhaserJS games, “Hacking” and anything else that might attract a script kiddie or someone taking a break from playing their favorite game for four hours a day. I have nothing against those people, just want to be really ready for a ton of absolute beginners.
• I never want my bedroom to be on a stream, ever.
• Watching people look up documentation using a graphic web browser is absolutely excruciating.
• Glad to see that class has reached mainstream enough in JavaScript to be picked up by Adam.
• OMG! Intellisense is so annoying and noob. I’m embarrassed for him watching it. But I’m in the wrong there. I shouldn’t care.
• Apparently picking .live was the right domain. Looks like that’s what he did. I am so fucking jazzed that I got rwx.tv and rwx.gg.

## Tuesday, February 18, 2020, 4:24:59PM

Having a lot of initial success with the group with Eloquent JavaScript in combination with an open JavaScript console next to them as they read along in the book. Don’t know why I didn’t think of that earlier. It’s the perfect way to do as you learn to help things sink in. I had done something similar with the whole codebook concept before, which was good because that was Python and Go as well. This method does depend on having an Internet connection, but there is the option of downloading or buying the book and doing the same thing offline, so it meets the what-if-everyone-did-it test.

I really love this approach because there is absolutely zero setup involved, nothing to install, no books to wait to arrive, no Internet setup, no Linux installation, just raw reading, coding, and self-evaluating immediately. I must remember who suggested I look into this book. It is really good.

## Tuesday, February 18, 2020, 1:52:49PM

Really need to put some effort into quantifying, capturing, and documenting important cognitive skills such as self-evaluation.

## Tuesday, February 18, 2020, 1:15:33PM

I really need to figure out how to change the stream to “Just Chatting” on occasion to talk through issues I find super interesting like leaving Mormonism, divorce, history, even politics. Still wondering if I need a different account for that or if setting the topic is enough.

## Monday, February 17, 2020, 8:33:31PM

JavaScript’s case statement is so lame after using Go’s for so long:

• No commas
• Forced break

```javascript
switch (card) {
  case 2: case 3: case 4: case 5: case 6:
    count++
    break
  case 7: case 8: case 9:
    // hold, no change
    break
  case 10: case 'J': case 'Q': case 'K': case 'A':
    count--
    break
}
```

```go
switch card {
case 2, 3, 4, 5, 6:
	count++
case 7, 8, 9:
	// hold, no change
case 10, 'J', 'Q', 'K', 'A':
	count--
}
```

And yes that works in Go with 'J' because it is interpreted as a rune (sort of).

## Monday, February 17, 2020, 3:49:46PM

Imma keep the VODs around for seven days. Eventually they will be subscriber only. I figure that gives everyone enough time to filter through all the wild, random conversation to find what they want, if that is worth it to them. Then I’ll make the videos continuously as I have been that are highlighted.

## Monday, February 17, 2020, 12:19:48PM

I’ve concluded that turning off the stream is much more effective for focusing my life and producing the best possible output, both for the stream and for my life. It lets me let go and be in the moment of Yoga, eating, cleaning, doing administrative tasks.
It causes me to look forward to completing tasks to get to the point where I can stream and focuses my mind when I am streaming. It’s no stretch that this conclusion is just a form of another rather famous one. Be present. Be true. Be learning.

## Monday, February 17, 2020, 10:23:43AM

I’m realizing that live streaming is really just “teaching by example” as it’s called in Sunday School. When you are doing pretty much everything in front of others — especially since they can replay it — you are giving them an opportunity to learn from your experience — including your mistakes — as they happen. I think it also makes you more relatable.

## Monday, February 17, 2020, 10:08:06AM

Rather than just ignore all the email for the week related to news, I will go add them to WeeklyNews (perhaps even organized by week) and sort of pre-filter them to reduce time when I do a news review on Saturday morning. That seems like a happy medium between doing a more official News segment like I tried, and just reading everything without ever opening them at the end of the week. I must keep News under one hour. Perhaps this will save me from the emotional rants that happen spontaneously because I am reading things in real-time and not having time to calm down and think about how to talk about it more rationally on Saturday morning. Discord responses, however, will be raw and my first time reading them.

In fact, thinking of things in terms of how they will be streamed and documented and shared as videos has really forced me to put some more structure around my weekly workflow (and personal), which is a good thing. One of the things I need to be better at is seriously focusing for a moment or two, and then allowing myself to read the chat in order to get things done. I’ll be sure to have Q&A where I do nothing more than just read the chat, but if I’m going to live-stream everything I have to set some boundaries or I know I’ll get nothing done.

Overall streaming has made me much more productive and way more informed, both of which are key to meeting my goals of helping people learn what they need to be current and stay current. I’m not talking about chasing trends. I’m talking about improving the core ability to curate effectively.

## Monday, February 17, 2020, 9:53:08AM

Did my first news and Discord chat review. It all went well and decided to use iconography for each thumbnail to make it easy to sort. Also tested Discord screen sharing and voice and there was no echo. So I’m good for doing interviews, private sessions, and even multiple voice participants on stream for topic podcast-y conversations. The sky is now officially the limit. This means we can do anything — including everything that I do in person. Theoretically I can hold every single one-on-one session that I already do IRL over Discord and/or Twitch. This implies a lot:

• Members who miss because of sickness or distance can still have a remote session.
• Remote sessions with more than one person can be attempted so long as they are able to get their desktop to work in a combined view. I think there’s a good multiple user streaming app for this, perhaps even Discord.

I’ve about finalized my decision to “age out” those I help IRL with those who are in need online, including possibly more than one person in the group so that I can make sessions even more affordable and attainable. Of course this is all tentative at this point and I don’t want to let anybody down by not fully testing the whole approach, but I’m very hopeful. The fact of the matter is that I’m meeting people on the stream who are far more likely to benefit from what I have to offer than some of the younger members here who can barely find time to code during the week and are essentially just like kids taking piano lessons who don’t practice and show up for a lesson each week having made zero progress.
I don’t have many of these, but I do have enough now that having them here while knowing someone who is more motivated than them is unable to get help annoys me enough to make a change. In short, passing an application interview just got a lot harder because the waiting list just blew up with motivated people from all over the world. Local people are now directly competing for my time with highly motivated people around the globe. So if you were born with a silver spoon in your mouth and you think throwing extra money at me will ensure you sessions and time with me, you will be surprised. That sounds so arrogant, but it’s the truth. I don’t give a shit how much money you have. The only thing that matters is how driven you are to learn and take ownership of your learning. (God, writing that last sentence felt good.)

## Monday, February 17, 2020, 9:52:36AM

I like tracking time in weeks. It’s been good to get a sense of time passing a bit better.

## Sunday, February 16, 2020, 2:56:14PM

Next time I do a video about Git hosting it has to include a lot more than just GitHub vs GitLab. The point is that there are other options if you don’t want to play the big “Facebook for developers” (GitHub) game.

Now that I’ve had a little more time to think about it, I’m really convinced that hosting repos all over the place is just not my thing. I cannot see a compelling reason for it anymore. The only significant reason was for visibility and to make it easier for those who already have a GitHub account, but anyone unwilling (or unable) to use GitLab is probably not someone you want to see a PR from anyway. People who do not understand the substantial advantages of GitLab’s foundational CI/CD integration (and all their other advantages) either out of ignorance or simply an inability to understand are not worth having on a project team until they do understand why. The end.

That sounds harsh, but I’ve been too wishy-washy on this topic and it has seriously cost me a lot of time and energy. I think that using what you want after putting in an enormous amount of energy to understand why your selection is objectively the best for your situation should be rewarded, not punished. After all, that’s exactly how Linux came into existence.

## Sunday, February 16, 2020, 2:49:06PM

I did realize that there is still a compelling reason to maintain an active GitHub account: to manage forks and submit issues using that service. If I seriously fork something I can move it to GitLab (like the latest vim-pandoc-syntax-simple fork of mine). I am going to keep more of everything under my personal account (instead of creating one of the very easy to create free groups on GitLab). I’ll save a group for when I actually have other people working on the project with me and want to vary the organization of repos and permissions. For example, the readme.world group will remain and I can put rw and any server infrastructure in there. S²OIL will remain as well as a place for utilities and documentation about lessons learned from the initiative and community that can accept submissions and participation from a much larger group than just me and the community here. And, of course, SKILSTAK will remain a group there as well. God, I’ll be glad when all this migration and cleanup is finished. I suppose this means my priority in that regard is finishing the basic repo tool for dealing with the repos on both systems, something the hub tool can never offer.

## Sunday, February 16, 2020, 2:08:35PM

Just ran into another project — with a core library — that has migrated to GitLab. I keep running into these. Usually it is really informed technologists and teams that are doing it. The cview team is remarkable. They seriously know their stuff and they have decided that to use the gitlab.com/tslocum/cview package you will forever have to be dependent on GitLab, not GitHub.
This flies in the face of the #1 reason to keep stuff on GitHub: availability and community size. I’m feeling horribly conflicted at the moment. After my evaluation earlier in the year about why everything public needs to be on GitHub because that is where the contributors and recruiters are, I’m seeing solid evidence that this is changing, slowly, but definitely changing. The conclusion I’m arriving at is that there are far fewer people on GitLab, but those who are on GitLab are by and large far more informed and intelligent about their decisions regarding source management and even aesthetics. In short, they are more discerning and would rather face not being found than be on GitHub. So which type of community do I want to make myself a part of? You already know that answer. GitLab beats GitHub on every single point that matters. The only thing I have not yet tested is their GraphQL API.

I’m going to sit on this decision for a week or so and talk it out with other amazing technologists on stream and in person, but I have a feeling I will be saying goodbye to the closed-source cluster-fuck that is GitHub. I already feel less dirty for porting things over to GitHub in the first place. I mean, if you depend on being found on GitHub or playing with all the script kiddies there that don’t even understand why GitLab is so ridiculously superior then you have bigger problems. It’s not unlike my having bought Macs at one point because I thought it would help attract more students. It certainly did, but the ones it attracted sucked. Had I stayed Linux my student base would have remained more like what I have today.

In other words, I’ve all but decided to abandon GitHub entirely, once and for all, and to put a placeholder there that says, find me on GitLab. The End. Ya know, every time I allow myself to be influenced by a perceived perception management concern it is always the wrong decision. Picking the superior ideas and technologies and using them no matter what others think is the only way to live. As for my people here? Well, imma let them decide for themselves. I’ll tell them what I’m doing and I’ll leave it at that. As for me, I’ll be keeping my stuff all under a new group named rwxrob and making a very distinct line between personal stuff and stuff for SkilStak. The ssoil stuff will remain its own entity.

## Sunday, February 16, 2020, 1:59:34PM

I’ve confirmed the problem with sound levels being out of sync. My lights on this board were not even hitting the orange at all. And when I bump up the channel volumes to put them in the orange on the board they go into the red on OBS Studio. Which means that OBS Studio is borked and off, which is consistent with what Rairden was complaining about earlier. He’ll like learning he was right again. ;) This means the only reason to watch OBS is to be able to quickly hide something with passwords, which means I have one less thing that needs to be on the main screen. Huzzah!

## Sunday, February 16, 2020, 1:52:20PM

It has to be said. Streamers who are putting their chat into their main video feed and who are not able to textually respond seem to be doing it the hard way. I know that Twitch is designed for streamers to be able to vocally respond to text chat on stream and that might help, but having HexChat on the screen set to “Always On Top” is much more efficient for the kind of live-coding streaming I’m doing because not only can I see the messages coming in, but I can respond to them immediately without even switching to show anything in the web browser. In fact, the only reason I even need the web browser at all is to edit the stream settings to change what people see when notifications go out.
And I’m also learning that you have to stop the stream anyway for that to really take hold properly, if even for a short moment. I’m getting much better at quickly starting and stopping the stream so I don’t lose the connection with everyone, which also helps to split the videos on demand (VODs).

## Sunday, February 16, 2020, 1:37:15PM

Reminded that my Twitch agent/bot needs to be able to regex search the titles from YouTube and list the URL for the video so I can pull it up instantly without cutting and pasting from YouTube. A sort of query language just for the content of the different videos in the https://rwxrob.live collection. Come to think of it, as soon as they are all paired with their written content on https://skilstak.io I could just pull up those and make the bot use the rw search command to find hits. That would integrate the whole damn thing! Woot!

## Sunday, February 16, 2020, 1:15:53PM

Arggg, the sound meter in OBS Studio lies. I know it isn’t probably the most scientific measure, but listening to music coming from the coffee music streams is at the right sound level and without changing any volumes I cannot hear myself on the news stream this morning. Actually I don’t think it is OBS Studio that is the problem. This board is way too underpowered to send an amplified signal to my device. Upping the main output volume would very likely fix this problem but the board is way too underpowered to do it. Hopefully the new board, which should work entirely through USB, can handle all this much better because the return signal will be digital and not depend on board amplification. This crappy board requires an analog setup. It is possible that it’s not the board at all, that the digital signal coming in as a mic source cannot be maxed further. For now that just means recording at ranges that hit regularly in the red on OBS. We’ll see what that does with distortion though.

## Sunday, February 16, 2020, 12:14:45PM

I got to thinking about how much information a person gives up about themselves to Google when they set up repeating, regular search reports that come as email. I imagine whatever heuristics they have for that stuff ranks such search terms with much higher priority, further creating an “echo chamber” effect not only on all Google activity, but all the activity of anyone in your home (on the same IP) who goes to any web page that uses Google Analytics. It’s pretty damn 1984 with the goal of controlling your buying habits, not world domination… oh wait, that is world domination.

Of course that got me to thinking (in the shower) about possible solutions. The obvious solution is for people to create and control their own web crawlers. This is an idea that would have been dismissed immediately in previous iterations of the Internet and WorldWideWeb but is very acceptable today given current bandwidth and computer resources. “What if everybody did it?” Well, that’s something we can’t know fully. It is certainly orders of magnitude better for humanity than if everyone traded in crypto-currencies, which would destroy our current world power grid.

There are several web crawlers out there that have come across my radar (none of which I can remember right now). One of the most interesting ones used Go concurrency to search the entire open web in 24 hours (was the claim). None of the crawlers (that I know of) cover the deep web, dark web, and README knowledge network, nor do they mine data from newsgroups, specific tweets, and other crawlable knowledge sources. Yet these alternative sources are invaluable in all decision making and knowledge of current events. The Go-Nuts mailing list comes to mind. The opinions and information expressed there never hit the mainstream social media of Twitter, Reddit, etc. Yet that information is 10 times more valuable than anything from social media.

So there is a very clear value proposition to creating a way for people to customize their own Internet keyword crawlers. A lot of the technology would overlap with crawlers/scanners that I already want to make to improve bug bounty hunting. Hell, the hacker community certainly doesn’t have any problem scanning the whole Internet all the time. Shodan.io regularly scans every device. Which is what I would envision. There is a lot more information to be had from the entire Internet than there is from just the Web. In fact, the amount of useful data on the Web is diminishing as more and more people keep their information in their own communities and countries.

I wonder what Aaron would make of all this? I wish he were here to lead us. There’s no doubt in my mind now that he was killed by the American government, which has said it would kill Snowden if it ever got a hold of him as well. God I hope I don’t suddenly show up in the news with a suspicious death like Aaron. But I’m nowhere near the threat he was. Aaron almost single-handedly defeated legislation in Congress through passionate, non-violent activism.

I certainly don’t have time to write such a tool right now, but after I finish rw and readme.world I definitely will. I wonder what we could name it, perhaps something to honor the memory of Aaron. Perhaps hil from his middle name Hillel. Now for a domain, the inexhaustible form of hope for projects I’ll eventually get to for the low, low price of $7 a year.

Humm, what if I used know.sh. Yeah, that might work. I’m not inclined to put a pretty GUI front end on it anyway, and with cview being a thing now that people could ssh into I would be directly rewarding people for searching for information in the safest, most private, most cost-effective way, through text. Yeah, I can see Aaron smiling about that idea. Sure he felt everyone should have access to data and I’m not saying they can’t, just that I want to build the entire infrastructure based on no GUI first and then layer an appropriate GUI on it after that.

In fact, maybe the main command needs to be know with a bunch of subcommands. Yeah, I really like that idea. Aaron will know it is for him. I like it too because it has all kinds of tag-line possibilities, like “in the know” and “did you know” and “be a know-it-all”.
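The subcommand shape is easy to prototype in shell before committing to Go. A hypothetical sketch of how a `know` dispatcher might look; the subcommand names (search, crawl) are made up for illustration:

```shell
# Hypothetical sketch of a `know` command dispatching to subcommands;
# the subcommands here (search, crawl) are invented for illustration.
know() {
  cmd="${1:-help}"
  [ $# -gt 0 ] && shift
  case "$cmd" in
    search) echo "searching for: $*" ;;  # would query the local index
    crawl)  echo "crawling: $*" ;;       # would launch a keyword crawler
    *)      echo "usage: know [search|crawl] <args>" ;;
  esac
}

know search aaron swartz   # prints: searching for: aaron swartz
```

Same pattern the git and hub tools use, and easy to port to Go later with a real argument parser.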

I really wish I had an army of developers to help make stuff like this into a reality. I suppose I’m doing the right things to find them, through streaming and finally putting myself out there on YouTube. So far the talent and minds that are attracted to my stream are rather significant.

## Sunday, February 16, 2020, 11:04:09AM

With Pandoc 2.9.2’s release it is really clear that the project is moving toward a data-centric design and architecture. This is phenomenally cool. It is consistent with the reason I left Hugo (forming Hugonot and FADB and contributing to the TOML project) as well as the main idea behind my Hugo tutorial that helps create data-centric, data-driven Hugo sites.

## Saturday, February 15, 2020, 3:09:17PM

Three types of learning from others one-on-one (or in a small group):

• Beginners teaching others what they just learned.
• Knights and serfs (currently employed professionals).
• Veterans mentoring what they did their whole life.

Teachers become facilitators, making and managing the connections.

## Saturday, February 15, 2020, 2:31:53PM

GrapheneOS is an amazing Android-compatible, drop-in replacement for the OS on your Google Pixel that has had all of Google’s spying removed. I think I’m finally ready to go back to a smart phone now that I know it exists. This is the middle path between using the vendor’s stuff and the insane Librem5. It means I don’t have to throw out my Pixel hardware and can actually make good use of it. Just have to have a good SIM card.

## Saturday, February 15, 2020, 1:04:25PM

Really loving the Raspi 4. It can run a full multi-user Minecraft server without a hiccup (the 3 could not, even overclocked) and it has full gigabit speeds, meaning with a USB 3 Ethernet adapter you can make an amazing packet sniffer and put it on your home network to learn all sorts of cool stuff about how networking works (not to mention sniff any unencrypted password your mom, dad, or sibling enters without HTTPS). Who doesn’t want to do that? ;) The fact that it is hardware means that it can’t be detected on the network like an Alpha in monitor mode could be, but it also means someone would have to look at the physical location of your home Internet connection to find it (and most don’t).
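Once the Pi is wedged inline, the capture itself is a one-liner. A rough sketch; the interface name (eth0) and the not-port-22 filter (to keep your own ssh session out of the capture) are assumptions, not a tested recipe, and it is printed here rather than run since a real capture needs root and tcpdump:

```shell
# Rough sketch of the capture command for a Pi on the home network;
# eth0 and the not-port-22 filter are assumptions. Printed instead of
# executed because a real capture needs root and tcpdump installed.
IFACE="${IFACE:-eth0}"
CMD="tcpdump -i $IFACE -w home.pcap not port 22"
echo "would run: $CMD"
```

The resulting home.pcap opens straight into Wireshark for the fun part.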

Also hearing about the Arduino Portenta H7.

“Run Tensorflow for low power machine learning.”

The ads for this thing are all pros using it in IoT settings with construction hats on, dual core processor at 480 MHz, hi-density connectors for infinite GPIO extensions. The thing is serious. Nothing says ARM == IoT more than a device like this.

Why do I care?

Because IoT is ARM. As much as I love Gooligum and MicroChip PICs for learning Assembly (because they are so simple) the true future of IoT is learning to code for devices such as the Arduino Portenta H7. Knowing they exist is the first solid indicator that this is where the industry is going. The complex PICs that MicroChip keeps putting out are not going to stand a chance against the power of low-power ARM chips for this stuff, even with MicroChip’s massive monopoly on the PIC market.

The movement of light TensorFlow crunched models to the endpoints is taking off like crazy as well, which means machine learning is key (not necessarily all “Data Science”, a horrible term that machine learning people hate as much as “Full Stack”).

Just hearing about a device called the Sierra AirLink FX30, a $200 minimal 3G/4G LTE cellular gateway. I smell pineapple, mmmmm! Here are some fun facts about it that confirm some of my main unpopular conclusions:

• No nano, just vi, not even vim.
• Just /bin/sh, which is probably ksh.

In other words, if you are into IoT you really need to pretend you have one of these and only one of these when you make your decisions about workflow, habits, and muscle-memory.

## Saturday, February 15, 2020, 11:27:03AM

Quickly mentioning the need I have had to start quizzing people immediately on their reading for the week to keep them doing it. I try to avoid any kind of quiz or test, but it is really the only thing that gets them doing it and helps me know if they even tried. If not, no worries, do it again next week. That means I need a way to quickly capture quiz questions for a given section of a given book, which works out because I’m already annotating the book, so having some self-test quiz questions along with exercises is the obvious goal. Here are some for example:

Introduction:

• Why should you use the command line?
• What is the book about?
• What does it mean to “live” on the command line?
• What are other ways besides the command line to use a computer?
• What is Linux? What does Linux-centric mean? Why is Linux different?
• What does Watts mean by “freedom”?
• What is an SBC? Name one example.
• What does CLI stand for?
• What does HCI stand for?
• What does REPL stand for?
• What does GUI stand for?
• What does TUI stand for?
• What is a CUI?
• How is a TUI different than a CLI?
• What HCIs were used before terminals existed?
• What HCIs might come in the future?
• Which HCI is the most efficient (so far)?
• What skills are required to use a CLI?
• What type of interface is used for a MUD?
• Who might prefer a MUD over a graphic game?
• Who might prefer a graphics-only game?
• Why is it important that we think about all users?
• What does accessibility mean?
• What makes “knowledge of the command line” different than “many computer skills”?
• Which version of the shell should you learn?

Part I:

• What is a shell? What do you do with it? What’s its purpose?
• How do you start and use it?
• What is the name of the shell that is default on Linux?

## Saturday, February 15, 2020, 11:06:42AM

Being reminded that working with some people requires they know exactly what they are going to get and generally how long it will take. Unfortunately this is impossible to state exactly for any one person. The best I can do is provide the “fruit” of their learning and direct them up the tree at each branching node toward the specific fruit they want. Here’s a rough summary of the most in-demand fruit here (in order of popularity):

1. A Minecraft server
2. A game
3. A tech job
4. A Web page
5. Hacker skills
6. An app

I’m feeling the need to return to a friendlier diagram of the path to get to these specifically, something with a bit more color. The question is whether to make them all on one chart (as I maintained for years) or on separate charts, and just how informal and fun to make it. To date everything has been black-and-white and very formal. I’m not a fan of kitschy forced graphic art and infographics, but maybe it’s time. Maybe I could hire the artists I know to pretty-ify it.

This week I need to have everyone declare their path (again) and I need to completely delete all my writing about “first stack, full stack” because honestly, in practice it doesn’t work. I’ve now had a year to fully test that approach and I can say pretty clearly it fails. It tries to maintain too many skills concurrently. Instead, the focus now is chopping everything that is not critical to the specific path and seriously isolating the core skills that everyone needs. That means terminal mastery for everyone like before, with skilstak.sh fully restored.
It means allowing some to drop if they are not interested in learning the most important skills as I see them, which, more harshly stated, means, “Learn terminal skills or get out.” This is especially true now that I have to turn away potential people from the stream whom I could take on to teach one-on-one in favor of those who might complain here locally. Streaming has been a really fortunate discovery for this very reason.

## Saturday, February 15, 2020, 10:59:28AM

Finally found how to silence the volume up and down sound. It’s under the Sounds tab of the Sound system preference window. I turned everything off since I’m streaming it all at any given point during the day. Just taking a note so I don’t forget.

## Saturday, February 15, 2020, 10:34:13AM

God I get triggered by people who put their religious affiliation as the first thing in their title like this: “Christian. Hacker. Pirate. Ninja. Author. KC3FRD. Hackers for Charity.” No one gives a shit. If you feel some guilt to your God and creator to make sure HE gets top billing, so be it, but don’t force all of us to read it. Sure it’s your feed. You are free to do what you want, including being an asshole. Ironically the people who do this are almost always Christians. You don’t see Quakers, Buddhists, Jehovah’s Witnesses, Muslims, Jews doing this (okay, some Muslims do put something praising Allah into everything, I suppose). The point is don’t. Anyone who actually gave a shit and treated what you have to say differently because of your professed belief in a particular religion (instead of proving your piety with your actual actions) is severely stuck in the worst kind of echo chamber. Think of it: “Hummm, this person is Christian, so I better listen to them even more than the others. Oooo, he’s wearing a cowboy hat, he’s my kind of people. I’m so glad I found him!” Sadly that is exactly what they think. No wonder we are in serious trouble.
The only thing worse is listening to a Fox News anchor more intently because she’s just so damn hot! Just ask Megyn Kelly, who had to “twirl” in front of Jabba the Fox Blob just to get the job despite her Ivy League law degrees. By the way, this is a great example of when the one true mantra and commandment can be applied: “What if everybody did it?” Asking that question immediately reveals how stupid it is. What if every single person on Twitter listed their religious affiliation as the first thing in their description? How about the Christians who do not? Are they less Christian because they didn’t? Is the reason you feel it should be the first fucking thing we read about you serious enough that you silently judge the rest of the world for not listing that they are a Christian very first? It’s like a really bad Black Mirror episode where all those who don’t profess their Christianity every time they fill out a profile are condemned for being too apathetic. It would be funny if it were not exactly what the Nazis did under Hitler. Not wearing enough swastikas? Hummm, suddenly you are watching for the people following you around every corner.

I know this seems like just a little thing. “Dude, why are you freaking out?” Because it is little things like this that trigger me because of the underlying dangerous psychology that promotes doing that shit. Some place deep down this asshole judges everyone who doesn’t list their religious affiliation first, but judges them even more when they do and it is another one. And no, it’s not like listing the proper pronoun to be used. As annoying as that is, listing he/him is practical because (as I’ve encountered on stream) you simply don’t know what pronoun to use for a random person on Twitter or Twitch. Sigh.

## Saturday, February 15, 2020, 10:24:48AM

I got an email (again) today about the fight against the Alaska pipeline and it always makes me wish I had unlimited time.
I like to think I’m putting a lot of good out into the world and making the best use of my time here, but I always feel like I should be doing more. I didn’t always feel that way. At one point I felt that if I was doing my Mormon church “calling” and taking care of my family (the way my wife at the time thought was acceptable) that I was good, that my contribution to humanity had been fulfilled. How wrong I was. People who get in their little insulated bubbles of religion or community and never look outside at what’s going on around them are almost a physical manifestation of Plato’s cave. I never ever want to find myself there again. Why the fuck do I have to be 52 to really realize this? (To be fair to myself, I knew it all along and only acted on it in my late 40s.) No wonder old people are always looking tired. They have less energy and more awareness, which is a frustrating combination to be sure.

## Friday, February 14, 2020, 9:19:46PM

I be coding on a Friday night, but only for a few super retro songs while cleaning up this fork of the vim-pandoc-syntax plugin. So happy to be free from that other maintainer. I have no ill will toward him, just so fucking happy open source allows me to steal it and do it my way.

## Friday, February 14, 2020, 8:40:36PM

Been playing with the Twitch rerun thing and ran into a forum of angry people insisting that Twitch is “live” streaming, and that people “scamming the system” with reruns need to be dealt with. I tend to agree. My original motivation was to make it easier for people who might not know how to get to the old videos, but honestly everyone pretty much knows that. I have run into a few people who don’t even know that I have videos or a YouTube channel with the best of them, but that is not really an issue I can overcome other than to inform them about them in the live chat and remind them they are there.
In other news, I actually did a full highlight video edit session on the live stream with zero problems with OBS. It was rather “meta,” to use the word of one person in chat. They were giving me real-time feedback on clipping and editing the video while I was doing it, so weird and wonderful. Tonight I’ll be turning the stream off for Valentine’s Day. Also, I’m not going to be leaving the stream on overnight anymore. I’m only going to live stream when I’m actually doing something, even if it is just drinking coffee and reviewing the news.

## Friday, February 14, 2020, 6:34:56PM

So the Internet won! Championed by Louis Rossmann, the horde of IT people defeated the corrupt CompTIA, which gave up its lobbying against “right to repair” (and pushing its obvious agenda of forcing everyone to buy their certification products). It is like the real-world version of either UEFI or Tivoization, take your pick.

## Friday, February 14, 2020, 6:18:57PM

I fucking hate Node. Sitting here watching this bloated piece of shit install for five minutes, making my system entirely subject to NPM Trojan worm attacks, but still I have to have it for Browsersync. I really need to port that entire codebase to Go. It’s such a natural application for it. Installs would be a single, tiny executable instead of pulling in all this Node shit.

## Friday, February 14, 2020, 3:36:04PM

As I’ve been cleaning and consolidating existing curriculum outlines I’ve been conflicted about where to put it. Do I keep it on skilstak.io or do I move it to robs.io? The answer is probably SkilStak, but I sometimes wonder if that is going to reduce the appeal of it just being me, which is exactly what skilstak is. People don’t often realize that skilstak.io is just my one-man company’s web site. There’s a dilemma between presenting substance through one’s web site and misleading people to think it is more than it is.
If I just had a personal site up I don’t know what the response would be versus something that looks more formal. At the end of the day, SkilStak.io is just a place to store my education content. Yes, it’s a company, but the company is just me. I imagine I would make that clarification to the Twitch Live Coding team when the time comes to apply. Perhaps that’s enough to keep things authentic. I do put that on the page. However, I had been thinking of making skilstak.io less first-person and more suitable for use in a classroom setting, since that’s one of my main goals.

I don’t know why I obsess about this stuff so much. I think it’s because I want to document a good model that other potential one-person private mentored community leaders might want to follow. If I can help them then the obsession has been worth it. That’s also the reason I so obsessively write everything down. My memory is horrible and someday I want to capture this experience and insight — including the trivial stuff — in some kind of guidebook. The more people who choose to mentor, and the more people who are helped by mentoring, the better we all become and the more broken traditional models will fall. In other words, this is how I do battle against a broken system.

One thing is for sure: robs.io will be where all of this sort of personal pontification goes, including the not-so-infrequent f-bomb. That alone will make SkilStak better. I suppose that means SkilStak will become a second-person knowledge base with no references to specific author things. When I add a callout (those things in different colored boxes) I’ll be sure to include the name related to it. That way I can include anecdotes and such from other authors, like William Shotts. The hard part of all this is that all of the material on skilstak.io will therefore need to be rewritten. But frankly it needed that anyway. Tech content is rarely done. It’s constantly in a fluid state.
What I really need to do is revise and update the content, decide if it still belongs, and change the voice. Then I can turn the search engine and index page back on. (Currently they are off so people don’t find all of it, hundreds of README modules.) I won’t lie. The amount of work is overwhelming when considering how much time I already spend in real-life mentoring sessions, and now with streaming, and the coding required to make the tools, well, let’s just say, I don’t have time for Witcher. ;) I will say this though: it’s far more rewarding now that the stream is with me. The feedback and appreciation is very motivating, to the point of obliterating any addictive desire to play through Witcher just one more time. It does mean, however, that a lot of the reading I have queued up will either have to wait or be something I stream while I do it. I suppose streaming the reading isn’t bad. I certainly won’t be polished, but that will help me put structure around the stuff that needs a more formal video created.

## Friday, February 14, 2020, 12:04:36PM

Just saw the news notifications come in for the day and it got me to thinking that maybe the best way to cover news is just to talk out loud about it as I read it every day, which is something I always do. Let’s be real, I’m never going to actually get to the “News” video that I want to do for the week. Why not just read the news and drink coffee with the stream? Well, for one, it means really letting the stream see all my email. But how bad is that? There’s really no one being doxed in my email. So I don’t think that’s enough reason not to do it. Humm, but what about the parents of people who come? Their emails are not exposed unless I open them. But everyone can see their first names. In the past everyone has been okay with people seeing their first names, which incidentally means showing my reservation calendar is also not doxing anyone. I suppose the question is whether I should wait for subscriber-only streams to do all of this.
That would certainly be safer. I think that really is the answer. I mean, I don’t want to be exclusionary, but having only subscribers at some level able to just chat and do news and stuff is probably worth it. It doesn’t stop people from participating in the IRC (which is another great reason everyone should be using IRC). But it does allow me to limit the video and audio stream to just subscribers. I think there’s a subscriber tier thing, so perhaps just the lowest tier could get people access to absolutely random me, as well as some value in analyzing the news of the day immediately rather than waiting. One thing’s for sure: I’m just not going to polish up a news video like Weekly Weird News. I just don’t have the time. That is their full-time job.

## Friday, February 14, 2020, 9:21:55AM

Streaming has got me reviewing all my written content and curriculum (again) and adjusting based on new priorities and focus. I suppose this is no different from any other time over the last six to seven years doing this. In fact, everyone really should reevaluate what they’re doing, why they’re doing it, and how to get better pretty regularly. I imagine those who don’t find life rather challenging. Somewhat ironically, this is how you get — and keep — an occupation. The word occupation is so much better than job or even career, but it still has something of a negative connotation. It’s not about occupying your time, it’s about finding the thing that will make time fly by and making every decision in the pursuit of that. Yes, it’s about bliss, but it’s also about focus. And it seems focus is in short supply these days.

So what is my focus? It goes without saying that my focus is on my health and family. There’s never a need to really say that. In fact, those who are most likely to put stuff like that in their titles and comments and Twitter descriptions are usually the most likely to not focus on those things.
It’s like they need a reminder or are afraid that if they don’t spell it out people won’t know, when in fact these things are obvious and need no calling out. People forget those things are for communicating to potential followers what they might want to learn or read about you. And usually it isn’t about your religion, family, or nationalism.

Anyway, where was I? So what’s my focus, at least now that I’m streaming? And how should it change, if at all? First, I’ve concluded that producing videos and live streaming are the biggest changes because they are immediate and potentially reach more people. So I’ve decided to start live streaming everything. People can filter out what they want so long as I describe well what I’m doing at any moment. Some might actually want to write or read these blogs with me, others might just want to tune in to watch me struggle through learning Rust for the first time, some come for the community more than me even. My stream just happens to be a hip place to hang out. There’s something very real that I only partially understand about what makes this sense of place a thing. But it definitely is a thing. Humans are complex. I feel like too often I try to understand why they do what they do, but really I should just accept it and act on it.

So, making myself available on the stream, putting myself out there (as some have asked) is really the first step. That alone is scary enough because I’m bound to make stupid mistakes that everyone can see. Maybe that’s the attraction. People wanna see humans being human. I wonder if that is related to why sitcoms about horrible families or “reality” shows do so well. Either we want to feel better than the imaginary family or we want to feel like we belong. This is why I’ve known so many people, some shut-ins, who buy stuff from QVC. They obsessively watch hours of QVC craving some sort of artificial connection. But the stream is a real connection. It might not be in person, but it’s real.
I think I’ve just discovered a new thing: drinking coffee off camera with just a mic and some nice music while processing all the new info from the night (I always wake with ideas) and getting myself moving and prepared for more during the day. Who knows, I might actually manage to get a schedule again, but I refuse to write it down. Having someone else read and expect a schedule is very risky. No, I’ll just be better about making sure the notifications that Twitch sends are accurate. After all, some people use their phones for this stuff and might even be woken up by the email notification sounding off at some early hour.

This blogging and talking thing is also a thing. I was concerned I was going to lose blogging and writing, but why not combine the two? Being able to take live questions and give real-time commentary makes this streaming thing so much more powerful than Twitter (although Twitter has its place for broadcasting and finding new people you want to know). Someone once told me, “Facebook is for staying connected to the people you’ve known. Twitter is for finding and connecting with the people you want to know.” (It was something like that.)

So what to do about the change in focus? It has definitely already affected my priorities for the better, but how? It’s probably a good thing to separate creation of videos (Twitch highlights) from live streaming. In many ways they are very different and serve different purposes. Videos are equivalent to what I’ve been calling README modules or articles. In fact, I plan on embedding every video into a specific written module. The video will eventually be secondary to the written content, but the video will come first. One of the things that’s really slowed me down has been editing all the modules for the different topics and skills I need to cover. So I’ll do the video first and eventually get to writing it up.
I have hundreds of videos to do, if not thousands, all very small, all covering very specific topics or steps. Some are going to be particularly hard to capture on video, which is why a good capture card is so essential. For example, creating any video that has to do with booting something, installing Linux, or working with a Raspberry Pi. A capture card covers all of this.

It’s becoming clear that moving to a remote subscriber model for the mentoring I’m doing is probably the right way to go because I can reach and help more people. After all, I wouldn’t be doing this if my goal wasn’t to help people. I’ve blogged before about why subscriber education seems best as opposed to Udemy videos, where you don’t get the community along with the videos. I don’t understand why people don’t realize that one of the greatest attractions of any educational community is the community. For example, people remember their college friends. They make connections learning together. Few things are more satisfying. If the goal of life is love and learning then the community is the most important part. Udemy and other such videos create an artificial sense of community but not an actual community. Subscriber streams — specifically on platforms like Twitch which use IRC — are fundamentally built on community. That is the building block that so many other online learning platforms are missing — including FreeCodeCamp, which claims to promote it but fails to facilitate it nearly as well as Twitch does. Once again, gaming leads to mainstream gains. It’s the “Disneyland Effect” all over again, which reminds me I really need to write that up more formally. I did find my old, deleted Mo Hax blog the other day. I might resurrect some of my best articles written for that. After all, thousands participated in the comments there. I wonder how I would copy over the comments.
The topics were surprisingly relevant to my conclusions about Twitch streaming, including — above all — the importance of fun and community, both of which promote the creation of dopamine in the brain, which is proven to promote learning retention. This is why gamified learning is such a powerful thing. It’s why we learn to play games so quickly even when the interactive tutorials are horrible. Well, that’s a lot for this morning.

1. Continue live streaming everything (well, not everything).
2. Prepare specific videos to go with material.
3. Don’t get bogged down by perfection.
4. Prioritize what people need most.
5. Create specific paths through the content.

That does make me wonder if eventually streaming Mr. Rob doing yoga might be strangely interesting to some people, which also reminds me that I need to figure out what is better: run a different stream, or change the category and make sure the notifications are clear about the stream. So for yoga, for example, I would change the category temporarily. Perhaps when playing a game as well. I need to figure out the protocol and expectations for that.

## Thursday, February 13, 2020, 5:54:09PM

So HexChat is definitely the IRC client I’ll be suggesting everyone seeking their PTP from me learn. It comes by default on Mint and is the standard way to get help from the community for anything related to Mint or Linux in general. As sexy as I personally find WeeChat and IRSSI to be, they are simply not as empowering. In fact, the only extra power they give you is the ability to use an IRC bouncer and participate in chats completely anonymously. There are enough ways to do that using VPNs and Tor while still using HexChat that the GUI benefits of seeing when another channel has new information strongly outweigh the benefits of being terminal-only. There, see, I’m not a terminal bigot. Right tool for the job.
Being able to set HexChat to always be on top and visually get all the data I need so quickly would be really hard to match without a lot of WeeChat customization. And even though WeeChat supports mouse interaction, at that point you have to seriously ask yourself why you aren’t just using HexChat in the first place. So I guess I need to review my list of stuff from the PTP chart.

## Thursday, February 13, 2020, 5:29:58PM

I inadvertently turned on the JavaScript console while browsing just to push over the rendered screen a bit (while streaming) and noticed Medium sends messages to the console specifically for developers, with a nice bit of ASCII art. You know the obvious conclusion here: I will always browse everything with my JavaScript console open from now on. Seems I wasn’t the first to hide Easter eggs there, and of course I wouldn’t be. It’s a natural way for geeks to connect at a basic level.

## Thursday, February 13, 2020, 5:00:30PM

After jumping through all kinds of hoops for a GitHub Action to pull down and configure a full Python 3.7 instance just to run Vint, just to lint the Vimscript for the vim-pandoc-syntax plugin, I’m really triggered about how completely wasteful having a full Action setup for something as trivially simple as linting is. In fact, the entire value proposition of such a thing can only be realized on projects where the number of different contributors is very high and their experience varied. Call me old-fashioned, but what is the difference between an “Action” and just running the damn linter from the command line before you even save the fucking thing? I think I know the answer: to enable continuous testing and not have to set up the testing every time. After all, that’s the central promise of the whole CI/CD thing.
But I can’t help thinking that, like everything DevOps these days, people are being stupid because they aren’t challenging the approach on every project; they’re just doing it because that is what everyone does — or worse — because that is what will get them a job or a new account. The catastrophe that has become the massive cluster fuck of microservices today — culminating in “mesh solutions” and the (re)invention of (g)RPC — is the natural result of all of this. Sometimes I swear people forget to use their brains; instead they read the latest Gartner Group report or use whatever has the most GitHub stars. We talked about Deno and Ryan Dahl today, and all of this is exactly why I love that guy so much. He sees things practically and constantly challenges everything, including his own massively popular creations. We need more of that; we need more Ryan Dahls.

## Thursday, February 13, 2020, 9:15:34AM

More conclusions:

## It’s okay to have kids use VSCode.

Those who prefer it can still use it, but I’m going to make another very clear map of what they cannot do until they learn vim.

## Need a fun tree with fruit on it.

This way they can see what they get when they climb out onto a particular branch.

## Need to find or write some good mind-mapping software.

So far I cannot find anything that has a decent outline conversion. Inspiration was so good, but they shut their doors in November 2019. That was a pretty good run for them. I used them a lot in 1996 for all my college work.

## GPLv3 all the things.

After reading a lot more about Tivoization I’m finding myself siding pretty strongly with the FSF, which is a bit odd for me. In fact, GPLv3 is the only thing that even attempts to take on the fight against the movement toward devices just being extensions of services that we don’t actually own. If a company chooses to be that shitty I want them to at least suffer from not having access to all the great code that I (and the rest of the free software developers) are making.
They can put in shitty Zsh substitutes to avoid giving back their improvements (as Apple did) but they will always suck for doing so. I have no problem with someone burning an encryption key onto a board and making their device not work if someone tampers with it, so long as they provide the key to the end user so they can do what they want with it. Not only is that appropriate, it is the only fair solution, and it covers every FUD inflammatory case (medical equipment, satellites, air traffic control). In fact, the government is actually likely to want Tivoization in order to assert dictatorial governance of the market and ensure they can monitor and spy on citizens however they deem fit. This is just not okay in any country. So Stallman (you controversial, industry-changing fuck), I’m with you on this one.

Linus can go fuck himself. Torvalds has always been completely clueless when it comes to the real legal matters at hand. (Probably had several governments and companies approach him.) I seriously doubt Torvalds has even read one of the court cases which the FSF has been constantly involved with to determine their course of action. But that asshole has the gall to throw shit and FUD at the FSF, suggesting they lied about what was in the license to entice people to use it instead of creating a new license. No wonder not a single person in the audience raised their hand when he asked. No, Linus is just old and clueless; brilliant, but clueless. Yes, I think a new license would have been better, but there were so many changes and clarifications to GPLv2 that it was the natural best place to add the Tivoization clause as well. There was no sneaking around. In fact, after having been slightly on Torvalds’ side and then fully reading the full arguments from the FSF legal team, I think Linus is not only uninformed but, to use his words, “a fucking moron” on this topic, as are most of the “open source” people who just want to MIT-license all the things.
I keep finding that people who truly dig into the issues almost always land on the side of the FSF, but hardly any of the developers I run across have even read any of the licenses involved, let alone understand them. This is how Facebook got away with their React shit.

## Wednesday, February 12, 2020, 1:32:30PM

Would it be okay to entirely drop the Senior Software Engineer track? I mean, it is implied in Senior Cybersecurity Engineer. Let’s face it, there is no time to learn Vue and all the sysops stuff for most people before they leave for college or wherever. I’m thinking a conversation about what Vue and React are, and what PWAs are, is as far as I take them. All of those topics are a good year of focus for most. Eliminating the extra web stuff allows them to get into Golang programming more. Since the shift, almost no one has been able to make time for Go and shell. Then again, most of them do no coding during the week except while they are here, something that continually annoys me and motivates me to drop them and seek others who are seriously focused on cybersecurity, admin, and ops. The streaming has revealed that there are potentially people out there who would follow — and pay — for those spots vacated by the once-a-week game developers in my community. There’s still a lot of consideration before making that decision. Perhaps a waiting list for those online who might want to do it. But knowing that is the eventual direction really allows me to focus on the specific content needed for my ideal group of community members. Instead I could approach web development from two angles: how to quickly publish results and participate in the README World Exchange of knowledge (blogs, etc.), and how to break into them. That means focusing on HTTP, curl, and such, and less on image formats for gaming. I mean, the material is there for those who want to continue on that path on their own, but I could laser focus on the cybersecurity and Linux certifications above all.
That really feels like the right thing to do. I’ve chopped so much, but chopping that makes so much sense. In fact, following on the idea of building on existing content rather than creating all my own, I could piggyback off the WGU focus and alter it, injecting it with LPIC instruction and preparation for OSCP and one or more network certifications. By openly focusing on these things from the beginning — and stating so up front — I end up with a community hyper-focused on what I believe is not only the most employable tech occupation, but the one the world most critically needs. In fact, I could see what is needed for these people to get the certified web developer certification that WGU requires and focus on them getting that, if they want it just for the paper, even though they would far surpass the requirements.

## Wednesday, February 12, 2020, 1:23:59PM

After deciding to go with a new skilstak.sh I’m realizing a lot of other changes I had in mind are now very easy. Perhaps the biggest is that new members don’t have to have their own Linux system. They just have to have access to a computer. That’s it. This opens up the possibility of offering mentoring in some way to others who are just starting out and want to learn the shell without setting up a Linux machine at first. This motivates people to learn Bash enough to customize their own bashrc files. In fact, I could provide the basic one and then require them to customize their own as an exercise. I love this because my modular refactor of my config fits right into this: they can copy what they want incrementally into their own. I’ve also concluded that my fear of having people become dependent on the server (as opposed to their own) is unfounded. History has shown that everyone starts up their own server as soon as possible once they have the skills. It’s like the drive many have to build their own computer instead of using the one provided by the school lab.
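A starter bashrc along those lines could be as small as this sketch. The specific aliases, variables, and prompt are illustrative choices, not a prescribed set; the point is that every line is one small thing a member can understand and then replace with their own:

```shell
# ~/.bashrc starter (illustrative): copy lines like these incrementally
# into your own file, then customize each one as an exercise.
export EDITOR=vim          # programs like git and crontab will open this editor
export HISTSIZE=10000      # keep plenty of command history to learn from
alias ll='ls -alF'         # detailed directory listing
alias gs='git status'      # common git shorthand
PS1='\u@\h:\w\$ '          # prompt showing user@host:directory
```

Handing out something this small also keeps the "customize it yourself" exercise honest, since there is nothing magic in it to cargo-cult.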
So I need not fear an imagined dependency on SkilStak. Hell, I have so few members I could offer an unlimited account for veterans who reach a certain level. Nah, they would have their own server at that point. ## Wednesday, February 12, 2020, 12:30:18PM Had a night to sleep on the new direction and a good long walk with my wife discussing it. The main paths to Senior Cybersecurity Engineer and Senior Software Developer will remain, along with the common foundation, but the question remains as to whether to allow multiple paths through the same initial content or to require everyone to take the same consistent path mostly for practical reasons, not particularly because that is the best path. Every school on the planet pretty much creates one consistent path for everyone, but I have the unique opportunity to vary that path based on the individual. However, when I do vary the path it seems like after the person completes the stuff they are interested in that they are never motivated to do the important stuff they need. This simply prolongs a situation that might have been detected and dealt with from the vary beginning. In other words, I need the filter back that I once had, and that filter is command line terminal for everything. There is simply no better filter for those who will eventually decide this isn’t their thing. And let’s fact it, I don’t want to waste my personal time helping anyone who doesn’t agree with the essential need for these skills, the most fundamental of which is the command line. There are a lot of honorable, needed occupations that do not emphasize the command line, I would just rather focus on those that do. The end. I’ve all but decided to drop the$30/month static IP and rebuild the original skilstak.sh remote cloud system for everyone. My highest quality people came out of here when I used that approach and the ability to log in from anywhere really drove their learning to new heights. 
Anyone with SSH and a terminal could do anything and didn’t have to stress about saving it to GitHub or even setting that up initially. I think I’ve underestimated how significant that barrier is for absolute beginners. As soon as we have to do ssh-keygen -t ed25519 their eyes glaze over. But if they have had time to get used to a remote command line — one that is very consistent — then when the time comes to learn that command and set up GitHub they are like, “Oh, so it’s just another command to add to my collection.”
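For the record, the eye-glazing part really is just a couple commands. A sketch (the /tmp path and the empty passphrase are only for the demo; real keys belong in ~/.ssh with a passphrase):

```shell
# start clean so the demo can be re-run
rm -f /tmp/demo_ed25519 /tmp/demo_ed25519.pub

# generate a modern ed25519 keypair (passphrase skipped only for the demo)
ssh-keygen -t ed25519 -N "" -C "demo key" -f /tmp/demo_ed25519

# two files result: the private half stays put, the .pub half goes to GitHub
ls -l /tmp/demo_ed25519 /tmp/demo_ed25519.pub
```

From there it honestly is just one more command in the collection: paste the .pub contents into GitHub and ssh does the rest.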

This indirectly makes a statement about VSCode as well. While it addresses the same hurdles, it presents a whole bunch of its own. Unfortunately the hurdles VSCode presents are very specific, and overcoming them provides no additional value to other endeavors.

I’m reminded of the absolute heyday that was SkilStak during the era of command-line first. Our first lesson was how to use ssh. Our second was how to navigate the command line. Third was how to edit files remotely from the terminal. This initial focus on the command line clearly set the mood and expectations for every single class to follow — for the better. I need that back.

So, what’s the plan? Simply to add skilstak.sh back in full force and to train everyone up on it immediately. We will divert from our current Web focus to terminal CLI skills — following the Linux book rather closely — and return to Web stuff up on the server. I’ll provide a web server for everyone that does not require Git and people can preview their development immediately as they do it.

The biggest challenge that remains will be the transferring of images up to the remote system. But honestly, that is a matter of learning scp which they have otherwise not had a reason to learn up to now.
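And the scp lesson itself is tiny. A sketch, with the host and remote paths made up, and the actual network copies left commented since they obviously need the server to exist:

```shell
# stage a sample image locally (placeholder bytes standing in for a real PNG)
mkdir -p /tmp/site/images
printf 'fake-png-bytes' > /tmp/site/images/logo.png

# pushing it up to the remote web root is one command
# (member@skilstak.sh and the path are illustrative only):
# scp /tmp/site/images/logo.png member@skilstak.sh:public_html/images/

# pulling a file back down is just the same arguments reversed:
# scp member@skilstak.sh:public_html/index.html /tmp/site/
```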

REPL.it still isn’t a good option because the ability to customize your bashrc file simply doesn’t exist and — most of all — REPL.it requires that one have a web interface.

This feels remarkably good and will ridiculously simplify my next lab rebuild. Come to think of it, the nature and availability of high-quality terminals now for all operating systems has removed the initial motivation for using the colored VSCode terminal completely.

Most importantly, however, this focus on the ability to be productive with nothing but a command line is the core value and skill that SkilStak has always been about. Anything else — no matter how justified by specific occupational pursuits — simply doesn’t belong here. The ultimate priority is on pentesting, systems administration, SRE/DevOps, and Full-Stack web development, all of which require robust knowledge of and skill with the backend command-line interface.

A few things will be significantly different:

• A password longer than 40 characters will be required. Public keys are safer, but they are less flexible and using them is more likely to cause a beginner to leave their private key cached on some unsafe computer from which they connected. This is a strange, rather rare situation where logging in with a password is safer overall than using keys. I’ll go forward with my security auditing plan to ensure guessable passwords are cracked and I get notified. In fact, since no one will die, I’m tempted to give everyone visibility into the /etc/shadow file to practice their cracking skills. It will seriously motivate people to maintain good passwords. It will instill the proper paranoia that we should always expect someone to have access to that file and be able to brute force it, ’cuz they will. This is one example of something completely counter-intuitive that no one else would ever consider, but it emphasizes learning the skills and habits of highly-skilled cybersecurity professionals. It will also seriously encourage people to log in often.

• Everyone will get their own subdomain. I have a small enough community now that there is no reason not to give them one. Eventually they will move off to Netlify or their own subdomain but that is secondary now. The primary priority is on command line navigation and development.

• Members will be encouraged to hack other members (within reason). They do it anyway every time I set this up. So encouraging it and managing that expectation will cause people to behave more like they are going to DefCon and be really prepared and safe with their content. I’ll have a hard and fast rule about destructive hacks, but nothing about being able to hack them and leave their stuff someplace else for them to recover easily. This will light a fire under those who are intrinsically attracted to cybersecurity and specifically to pentesting. If the risk of being hacked and the “trouble” to protect yourself is too much for anyone, well, they probably don’t belong here in the first place.

• Everyone will have a web server. Automatically everyone will get a web server and a range of ports they can use to bring up their own services for experimenting with tunneling, websockets, gRPC, and more.

• Keybase integration. Keybase will be the common way to share files between members and a very easy way to add images and such to the system without having to necessarily learn how to use scp (which we will still learn).

• Significant, ongoing capture-the-flag games will be emphasized. There is nothing more fun than this. It is what shapes their interests. Even those most interested in creating games of their own are so overwhelmed by Easter eggs and hacking real-life games that they can focus on little else once they start. These are my core community members and I want more than anything to free up time to focus on this stuff.

• Web development de-emphasized. There will still be web development enough to get the WGU CWD certification but nothing more. A fundamental understanding of web pages and such is essential for everything — but especially for cybersecurity and bug bounty efforts.
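The per-member web server bullet above needs almost nothing to prototype. Even the Python stdlib one-liner would do for instant previews (the port and directory here are made up for the sketch; a real setup would run one per member as a service):

```shell
# fake a member web root with a single page
mkdir -p /tmp/public_html
echo '<h1>hello from my subdomain</h1>' > /tmp/public_html/index.html

# serve it on an assigned port (8043 is arbitrary) in the background
python3 -m http.server 8043 --directory /tmp/public_html >/dev/null 2>&1 &
SRV=$!
sleep 1

# the preview works immediately -- no Git push required
curl -s http://localhost:8043/index.html > /tmp/preview.html
kill $SRV
cat /tmp/preview.html
```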

## Tuesday, February 11, 2020, 6:12:07PM

I’m a little conflicted now on what should be the first JavaScript book, or even the first book period. It seems like five main paths and types of learners are emerging based on the project or thing that the person most wants to get the fastest:

1. Video game.
2. Minecraft server.
3. Web site/app.
4. Job in tech.
5. Hacker skills.

I’ve had multiple paths before. All of them include the same material just in different orders. This makes me question the preferred path I outlined in Modern Technology Foundations.

I think the experiment bypassing command-line skills that I tried with one person these last few months blew up because the person ended up really needing and wanting them once they figured out how much easier they are than using the GUI to move things around.

The question is do I maintain my focus and requirement for everyone to get on Linux first? On the one hand you have the solid experiences providing low-hanging fruit, but on the other such experiences can give a false impression of the actual work involved to become a technologist with the fundamentals I maintain are foundational.

However, back when everyone had to learn shell by logging into skilstak.sh and even learned vi before anything else, I had tremendous success. Every one of those people who survived went on to do amazing things, including full employment at 16 in technical jobs. Did that approach provide a filter, much like an intense Chem or Physics 101 does? I think it did. If so, do I still want to keep that filter in place? Or do I soften everything for those who might never need Linux and only minimally need the command line?

I think I know the answer, I just am having a hard time admitting it. I really just don’t want to deal with people who are not naturally attracted to Linux and the command line. That’s not bad to say. It’s not evil of me to only want to work with people who intuitively understand why this is objectively true. Plus it is my sweet spot. People who want to use VSCode their whole lives and play around with React and web graphics and — for God sakes only want to make a game with GameMaker or Unity — can simply find some other way to learn that stuff.

In short, I want to further focus specifically on the following:

1. Linux
2. Pentesting
3. Bug Bounties
4. Minimal JAMstack Web Sites and Progressive Web Apps in Vue
5. Physical Computing

I fucking hate game development and always have. (Actually I wanted to do it when I was 12 so that’s not completely fair. I must never forget that). I always teach game development as a means to an end. But I’d rather help people setup Minecraft servers all day and watch them become kick-ass system administrators and get Linux certified at 15 than have someone create the next fucking Fortnite. And there it is, the truth.

The disappointing reality of my initial effort teaching GameMaker is that my 3rd student ever never amounted to anything more than a crappy GameMaker user who shrugged off even the GameMaker language and stuck with the pretty GUI interface. It was a serious mistake to let that go on for as long as I did. At the same time others his age were way into Minecraft and went on to do seriously amazing things, one even teaching himself Assembly.

The sad reality is that if someone is fundamentally attracted to making games (and little more) I’m never going to be able to entice them out of it, and frankly it’s not my place to do so. People are people. And the world clearly needs games, far more than reality television. Games are art. I’m not attacking games. Just my frustration with people who are obsessed with nothing more than making them. It’s tough because just the other night I met a brilliant Assembly programmer who got into it because he wanted to hack NES games in Assembly. I want to believe those skills will eventually be applied in other places as well. But I’m not going down that road. Not when I can create gamified hackathons and hack-the-box puzzles related to real skills that our world so desperately needs.

I’m super triggered tonight because I actually let myself give in to downloading GameMaker for a brilliant person here and trying it on Linux and suddenly realized, “What the FUCK am I doing?” I rather directly said, “You have two options, this book or the other. You pick. That’s it. If you want something more or don’t find this stuff interesting it’s time for you to leave.” (Yes I actually said that.) That’s pretty damn harsh of me. But it worked and he actually enjoyed the new book.

I’m not normally like that. I’m just tired of having to be a creative marketing charlatan to get people to come around to doing what I hope will amount to skills that can help us all. This is compounded by the fact that I have several members of my community who do understand this stuff and seek from a very young age to make that difference and have. I really wish I understood why that is. Best I can do is curate a community of just such individuals. After all, I started this community and I can decide who stays and who is encouraged not to return.

Streaming has been great because it further proves that such people are out there, many of them are in college, which makes sense. Others have lived a fair amount of their lives and come to the same set of priorities that I seem to have. So perhaps streaming is the answer to my frustration that I’ve been seeking.

GameMaker can fuck right off.

Just realized that when I have a capture card I can capture the BIOS setup and installation details regarding UEFI installations for all the different computers I have around the lab. There’s a good chance one will look like what the viewer has at home. I have a pretty good sample of BIOS types: MSI, America (whatever, the blue one everyone has), Dell (which is rather unique), and even an HP. Hell, I can even capture how to do it on a Mac Mini. I think that variety of BIOS setup videos is easily worth the $150 or so that the cards cost. YouTube has very few (substantial) videos that actually show working with the BIOS through these challenges, which continue to be the biggest hurdles for most people trying out Linux installations for the first time.

So now the question is which of the capture cards is best for this type of thing. A lot of the recommended ones are not generic enough. I think I am sold on the idea of having component-in as well since that allows capture from even old video cameras, of which we have many in the lab.

## Tuesday, February 11, 2020, 5:47:10PM

I am really liking this Eloquent JavaScript book. I have someone working through it with the console open right alongside the free, Creative Commons book and tinkering with the concepts as they read them. That’s phenomenal to me. I also read ahead and noticed that the main project in the book is a nice platformer in which the reader codes their own gravity and collision, which is something I always like to do before giving them a PhaserJS or other library crutch. I think I’m really sold. I’ll recommend this book first (before Learning JavaScript) for most, but stick with the main recommendation of the O’Reilly book because it is quite a bit more thorough (though nothing like Definitive JavaScript). Eloquent JavaScript appeals more to younger coders as well, and most of those with whom I work are rather young. Plus they can say they literally coded a game from scratch.
Plus the writing and references in Eloquent JS are hilarious and the author doesn’t shy away from math and horrible puns and metaphors. I’m really quite impressed.

## Tuesday, February 11, 2020, 4:34:31PM

Just hearing about Pop!_OS from System76 and it looks encouraging. I’m so not a fan of that company, having had a horrible experience with their customer service in 2013, but perhaps they have matured. Their computers are still ridiculously overpriced and their return policies are frankly unacceptable in today’s market. (They charged me like $150 for “rebox-ing” and “scratches” after the laptop I purchased turned out to be ridiculously underpowered for the price.)

Evaluating distro installs and such prompted me to start looking for a way to capture the output to the monitor and stream that to OBS, since anything that goes to the monitor includes all the BIOS and other things. It would be the best way to stream such sessions rather than sticking a camera in front of the monitor.

Looks like there are several brands and they all do exactly what I’m thinking of because they are used to capture screens from consoles and such. This is frustrating because I know I need one to make the installation video for Mint to capture the BIOS stuff.

## Tuesday, February 11, 2020, 2:00:47PM

Trying to capture all the streaming tips I’ve been learning as I go, but man there are a lot of them.

Just completed my first thumbnail after giving it a lot of consideration and it is not fun. I would much rather be producing content and code. It could almost be automated, but I’m afraid there is no way to create the initial screen capture required so it is just faster to make it from the most relevant image from the stream combined with a title that is readable above all else, even if it is not particularly beautiful. Ain’t got no time for that.

The amount of blogging I do has already been significantly reduced due to all the videos. I’m okay with that so long as the major keywords and concepts are written somewhere that can be found on the Internet through the normal searches. That said, I will be producing far less written content, that’s for sure. I can convey so much more verbally — especially with body language, which makes sense since that is the natural best form of human communication. As a bonus, it means I can communicate much better with Russians and French people because I speak those languages but have a hard time typing them. At the end of the day I’ve always been a better verbal communicator than writer, although I do love writing.

One potential side effect of this is that I will free up time faster for writing creative works rather than all this tech/work stuff.

One huge casualty in all of this will be the README World Exchange project, which is now very secondary to getting as much content created as possible as video. After that the priority is distilling and organizing those videos into lessons with a path and as much written material and exercises as possible. All the rest of the projects after that are tertiary, including, I’m afraid, the RWX project.

So, I will triage the README WorldPress (rw) tool that I have and get it using Pandoc for builds again and finish the search indexing with the goal of getting skilstak.io back up to date and both rxwrob.live and robs.io split off of it.

The SOIL project is also all but officially dead. Another academic streamer and I talked through that whole burden of governance for even the most loosely organized groups, and he basically talked me out of it by reminding me of realities I somehow already knew. That no matter what, you will always have drama when there is more than one person involved in any endeavor. We agreed a BDFL is a better approach and is the governance model for most successful open source projects, including Linux and Python. Even having a SOIL convention would incur massive organization and lost time — and for what? To spend a lot of time coming together to learn and discuss stuff that we could more easily cover on a collaborative live stream? No, the future — and present — is streaming. Conventions are dead. They really are. It is great meeting people in person and that will always be a thing, but it will never produce the total value that doing all of the same thing online can do. (One-on-one, in-person mentoring is an entirely different thing, and has the potential to work online as well, but I have yet to successfully test that premise.)

It’s worth noting that the extremely brilliant minds and souls behind oxide.computer seem to have arrived at something similar to this conclusion as well. They are podcasting and building, period. There is something to be said for keeping laser-focused on good output, be it content or code, and just sharing what is happening so people can find and follow it. It’s also the most inclusive approach. As much as I want to go to DefCon it will probably never be a reality — even if it were held in my home town. So I’m getting a lot of internal confirmation that this is the right path to take with all of this.

Twitter posting has also dropped to zero. I only post really good stuff and links to videos that are out and invitations to specific live streams. I think everyone will enjoy that more because the content is more specific and they can choose whether to just shoot the breeze or check out the more polished videos as they come out. At some point I have a feeling I will be picked up by the writers themselves and can perhaps get an interview with one or two on stream, and I have to prepare technically for that possibility above all others.

Another project that I’m nixing is the “couch talk” here in the studio. It’s far more important that I get the ability to have phone and Discord and Slack and Skype interviews and conversations on stream than anything I can do in person here. It would certainly be fun, but it’s secondary to the content itself.

I am not nixing the remote hack-the-box type games and Easter eggs in my web site. They have massive appeal and reach not only those that I mentor here in person but everyone on the Internet. That is an even higher priority than finishing the RWX project (but not the RW project, which I need in order to document things as quickly as possible). Hmm, maybe I could keep RWX ultra-lightweight and just make it a place for people to post their RW sites. That might be doable within the same parameters.

To summarize, my single top priority is producing as much high-quality educational content as possible in the core domains that I have always been focused on. This means videos about all the introductory stuff that I do IRL with my mentored sessions to help everyone find their bearings and path, then it is annotating the main books and certification outlines in an organized way, not quite as organized as taking a course, per se, but enough that people find value in it enough to motivate them to continue with their learning (and save as much money as they possibly can).

## Monday, February 10, 2020, 6:44:31PM

Twitch’s sending out of email notifications that I am live on stream when I’m actually not is getting really annoying. At this point it is looking like it is doing something to detect changes to the visual stream itself. So when I pop on to write a quick blog post everyone gets blasted with “Rob’s Live” and that has to be annoying. I feel like I need to apologize on Twitch’s behalf every time that happens.

## Monday, February 10, 2020, 6:41:03PM

Behringer music equipment is absolute shit. I’ll never buy another product from that company. Allen & Heath all the way. It’s always worth the few extra dollars to get a quality product.

## Monday, February 10, 2020, 3:59:57PM

A great new friend on stream suggested I look at Eloquent JavaScript, which I had bumped into a few times but not fully read. I’m also pretty sure I was looking at an outdated version of it. The new version is amazing. It is (so far) on par with or better than Learning JavaScript (my current standard recommendation). Imma read both again, compare them, and make a decision about which one should come first. Unfortunately, neither has good examples and exercises, but that is okay so long as readers approach either as source from which to create their own codebooks and example exercises that they can repeat. I keep running into this need for exercises that are specific to the concepts. Perhaps that is an area I can focus on specifically and attach to these works to help out those who are reading them.

The fact that Eloquent JavaScript is entirely Creative Commons almost makes me want to use it over the other one just because of that. I could even fork it and supplement it with the stuff I feel is perhaps missing. Now there is an idea: get behind an open work rather than making yet another one. So long as a consistent voice can be maintained, collaborative authoring could work, even if we have to work around the fact that the book is written in the first person (which I would probably do myself).

## Monday, February 10, 2020, 2:26:06PM

What if everyone streamed? Would we bring about Datageddon? I think the answer is no based on the pace of storage technology improvements. I had a PowerMac with 30MBs of storage in 1997. Today you can buy a 4TB hard drive for under $100. So even though we are all producing a lot more data — especially video data — we are currently outpacing that need by several orders of magnitude. In other words, it’s okay to rely on streaming video as a medium over the written word in terms of sustainability.

## Monday, February 10, 2020, 1:55:32PM

There’s only so much time in the day. Been thinking a lot about the nature of live streaming, not just for coding, but more of the Just Chatting sort of thing. Recently I was able to have some great conversations that were only loosely associated with tech, super rewarding stuff.

One of the dilemmas I’m facing now that I stream all the time is the impact it has had on my writing (blogging). I used to blog all of these random thoughts to put structure around them, but now I’m finding just talking them through (mostly to myself but also to the stream) is rather effective as well as cathartic, not to mention the bonus of randomly meeting and getting input from others all over the world.

Fatigue is an issue. I mean, I type pretty fast, but being able to articulate an idea on video with voice is far more accurate at capturing the idea. Words are far more risky when they are written. Intonation and inflections are great things we employ in spoken language, but are lost when written. Hence emojis, etc. I wonder if this is why we’ve become so obsessed with video communication. It makes me ask where streaming fits into it all.

I have a feeling I’ll be making a lot more videos than blog posts now that I’ve discovered this medium (pun intended) and sticking with blog posts for things that require a lot of written material, for example, coding exercises, snippets, URLs, etc.

These days it’s somewhat safe to assume the videos I make will live even beyond my lifetime. This raises lots of questions, but since YouTube does not delete and accounts are free, it’s possible that our video blogs (vlogs) could best represent our thoughts and discoveries well into the future.

Writing is still required to pull up in search results. So there’s that to consider as well.

## Monday, February 10, 2020, 12:00:03AM

Need to look at Pretzel for music so videos on demand (VODs) don’t get muted for copyright issues.

Also found zorchenhimer on Twitch who coded a couple NES games in Assembly.

## Saturday, February 8, 2020, 6:01:39PM

Need to keep the status line in its own file so that the schedule and plan stuff can remain unaltered, like another record in a database. Kinda makes me rethink .plan and .project and want to just use them since they have been around forever.

## Saturday, February 8, 2020, 4:18:19PM

It’s been a while since I’ve used IRC (I’m very sorry to say) but since Twitch uses it for everything it has sparked the interest to get into it again and help others understand it.

Something I’ve noticed is that Linux Mint comes with Hexchat preinstalled, with a server called SpotChat and #linuxmint-help standard. This is where the Mint team has decided to do their main support. I really need to hit it harder because people can get immediate answers to their questions there. This means IRC is very much the core tool I’ve known it to be when I added it to PTP. Twitch has just accelerated the need to cover it.

## Saturday, February 8, 2020, 2:30:54PM

So many things running through my head today related to the blogging thing. One decision was what time period to divide into. I’ve thought (and blogged) about this before. I’ve settled on blogging by week and using the week number of the year (52 maximum) as a measure instead of months and days. The URL is much smaller and frankly no one ever types in one of those monstrously long blog URLs. Being able to give a succinct URL trumps all of that. For example, this is the first post in a blog with the new format that people can find with nothing more than https://robs.io/2020w05. That is the shortest blog URL I have ever seen.
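For what it’s worth, date can spit that slug out directly. One gotcha worth pinning down is which week-numbering convention to standardize on: %U counts Sunday-started weeks from 00 while %V is the ISO-8601 week, and the two differ by one for this very date (GNU date assumed here):

```shell
# build a week slug like 2020w05 for a given date
# (%U = Sunday-started weeks, counted from 00)
d=2020-02-08
printf '%sw%s\n' "$(date -d "$d" +%Y)" "$(date -d "$d" +%U)"   # 2020w05

# the ISO-8601 week for the same date is one higher
date -d "$d" +%GW%V   # 2020W06
```

Whichever convention gets picked, it has to stay fixed forever or old URLs shift by a week.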

There’s really no need to put blog in the URL. It’s obvious. Plus the entire appeal of README Repos is that they are personalized to begin with. Anything starting with a format like that can easily be assumed to be a collection of blog posts for that week. The URLs (which are README module IDs) will never conflict with an actual module name. But — most of all — the URLs are short and always will be.

Another great side effect of using week numbers is I really get a sense of time progression using them, much more than months. I can recall that things were some number of weeks ago and suggest others go back and look at when they were. “That’s in week 3 or 4 I think,” is the type of thing I could share.

I’ve decided not to put any kind of heading in the blogs as well so that the most recent stuff is always on the top, no introductory paragraph to read or title to scroll past. The title can still be specified, of course, in the YAML front-matter as with any Pandoc document.