2020 Week 9

Sunday, March 8, 2020, 8:15:29PM

Alias -> Exported Function -> POSIX Shell Script -> Bash Shell Script -> Compiled

Sunday, March 8, 2020, 3:10:01PM

Twitch wins again! A new friend on Twitch just shared this text-based pastebin alternative: https://ix.io. I am in fucking terminal heaven over here. I simply cannot put a value on the number of amazing insights I have had over the last month on Twitch.

I am so fucking blown away by how useful this is. You can create a completely anonymous post from the command line by just sending a file to the ix command after creating a simple ~/bin/ix script like this:

argsorin "$*" | curl -F 'f:1=<-' ix.io

argsorin is a useful script I made some time ago to intelligently detect arguments and smoosh them together, or read from stdin for heredoc syntax and stuff:


argsorin () {
    buf="$*"
    if [ -n "$buf" ]; then
        echo "$buf"
    else
        while IFS= read -r line; do
            echo "$line"
        done
    fi
}

argsorin "$@"

Here’s a simple example:

ix echo hello world
curl ix.io/2dLp
echo hello world

I usually write this stuff as functions so they can be cut and paste easily if people want to just incorporate into their own code rather than invoking a sub-process just for the command.
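As a sketch, the function form might look like this (assumes curl is installed and that ix.io is still up; untested against the live service here):

```shell
# ix as a cut-and-paste-able function rather than a ~/bin script.
# Posts its arguments as the paste, or stdin when no arguments
# are given.
ix () {
  if [ $# -gt 0 ]; then
    echo "$*" | curl -F 'f:1=<-' ix.io
  else
    curl -F 'f:1=<-' ix.io
  fi
}
```

Anyone who wants it can then just drop the function into their own bashrc.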

Sunday, March 8, 2020, 2:40:42PM

Need to take a look at https://liveoverflow.com which fox_maccloud is recommending for picking up memory buffer overflow vulns.

Also need to get a resources wiki-ish page ready to hold all these references.

Sunday, March 8, 2020, 1:18:05PM

TIL you can add -x to bash to cause it to print each command of the script as it is being executed. No more multiple echo statements. Huzzah!
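A minimal demo (the script path is just for illustration):

```shell
# bash -x prints every command to stderr, prefixed with +, as it
# runs. Same effect as putting `set -x` inside the script itself.
cat > /tmp/trace-demo.sh <<'EOF'
greeting=hello
echo "$greeting world"
EOF
bash -x /tmp/trace-demo.sh
```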

Sunday, March 8, 2020, 1:13:46PM

Zsh, or Z shell, was first released by Paul Falstad back in 1990 when he was still a student at Princeton University.

I see that continuing to be a problem. People make things during college that should never have existed and somehow people adopt them as standards. The simple vimtutor (which is totally broken) is just one such example.

Sunday, March 8, 2020, 1:11:20PM

There’s something miraculous about purging the angst of hitting horrible technology (that I cannot legally hack despite my initial impulse). Shameful Shitty Sites is one way to deal with it. Who knows maybe it will pull up in an Internet search one day.

Sunday, March 8, 2020, 12:12:02PM

Still feeling the pain of that major burn setting up integration of Bash command functions with applications that must call them from the exec() system call (rendering them useless). I still feel they are far more useful than aliases and should be used for anything that will be called from the shell. In short, the best way to stay safe is to simply think of them as shell extensions, period. Going too far and thinking of them as commands will eventually trip you up (as it did me with TMUX status).

The interesting question is then what language — specifically which shell version — to write when creating actual command scripts. To me the answer is pretty obvious: POSIX shell. These days the closest shell to POSIX is dash, without question. It is behind /bin/sh on all Debian systems and is wicked fast. Since dash is already loaded into memory (like bash) there is a good chance the interpreter loading time is saved as well (which is not the case with Python or Perl).

So the moral of this sad tale is keep your functions as extensions of your shell and write anything else as Dash scripts in ~/bin. In doing so we keep everyone happy because /bin/sh might be pretty much anything on a given system and because your command scripts are all POSIX there is never any fear of them not running on something.

Unfortunately this means that learning the shitty sub-process intensive approaches is still required to maintain POSIX compliance. (Apparently the world isn’t ready for extreme efficiency like that.) If you do want it, and don’t particularly care about portability, just write it in Bash and be happy, but wait until you actually need a Bash-ism like good regular expressions or avoiding a subproc monstrosity. After all POSIX was made for a completely different world with different constraints. We don’t use many dynamically linked libraries now, for example. Everything seems to be into static linking, and usually for good reason. POSIX isn’t dead, but it sure isn’t required like it was anymore either.

As for me? Everything starts as a command function until I need it from another program (exec() syscall). Then it gets ported to a /bin/sh POSIX script. Then if I even have to make one subproc I change /bin/sh to /bin/bash instead and live happily ever after. The end.
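To make that progression concrete, here is a hedged sketch of the middle, POSIX-script stage (the greet name and behavior are made up for illustration):

```shell
#!/bin/sh
# Hypothetical ~/bin/greet: started life as a bash command
# function, then got ported nearly verbatim into a POSIX script
# once another program needed to call it via exec().
# Nothing here is bash-specific, so /bin/sh (dash on Debian)
# runs it fine.
greet () {
    printf 'hello %s\n' "${1:-world}"
}
greet "$@"
```

If a Bash-ism later creeps in, the only change is the shebang to /bin/bash.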

In most cases someone coming across /bin/bash who is using Ksh or Zsh should just be able to change the name. But that’s no longer my problem.

Sunday, March 8, 2020, 7:37:08AM

While working on my TMUX status line and realizing that my exported bash command functions were not usable with the #(status) syntax, I did some digging and found this information.

This really had me challenging this new way of managing what I had previously managed as a collection of stuff in ~/bin. In fact, it has caused me to seriously consider when to even use them.

Sometime in November I started to rewrite my Bash configuration scripts. I noted that exported command functions are ridiculously faster than invoking shell scripts. So being the freak I am when it comes to efficiency I ported most of my utility commands into command functions that are available to be run at any moment using the export -f myfunc syntax, which is not available on anything else.

Not only are command functions much faster than actual command scripts, they are far easier to maintain since they all go in some version of a bashrc file which can be easily distributed.
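The mechanic, for reference, is just this (bash only; the function name is made up):

```shell
# Define a function and export it; child bash processes then
# inherit it as if it were a command on PATH. zsh, ksh, and dash
# have no equivalent of export -f.
hello () { echo "hello from an exported function"; }
export -f hello
bash -c hello
```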

I have just read all about “ShellShock” (also “Bashdoor”) which happened in 2014. I’m completely and totally blown away by the scope of that vulnerability and how I have inadvertently fallen directly into a horrible trap. I now have to help others avoid it and responsibly describe how I made a very understandable mistake in using command functions at all (despite all the shit I say about zsh not supporting them). There’s actually a very good reason not to use them (or at least there was). By tacking code onto the end of the exported environment variable containing a command function, attackers created the biggest shell exploit in history. The fact that this bug went so long without being caught is unfathomable. Frankly it is a huge embarrassment for the Bash team. Everything has been patched, but no wonder so many Linux people have avoided command functions like the plague despite their utility.
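For the record, the well-known one-liner check (safe to run; a patched bash prints only "test"):

```shell
# CVE-2014-6271 check: a vulnerable bash executes the code
# smuggled in after the function definition in the environment
# variable and prints "vulnerable" before "test"; a patched bash
# prints only "test".
env x='() { :;}; echo vulnerable' bash -c 'echo test'
```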

Does this mean I will be converting to zsh? No way. But it does mean that creating a ~/bin directory (as I have done for more than two decades) is still the best way to manage all your utilities.

The single best criterion as to whether something should be a Bash command function or an actual command script is whether or not it will ever be used by anything but bash. So, for example, creating command functions for stuff that is frequently paired with !! in vi is fine since that always shells out to Bash, but if any of those functions are to be used from anything but Bash they might not be so useful.

This has me still horribly conflicted between the modularity and convention of the ~/bin approach versus the decidedly only Bash approach of using command functions.

Saturday, March 7, 2020, 2:54:43PM

So looks like there is color emoji support in Alacritty in the latest version by providing a font “fallback” for them (instead of the black and white versions). Very helpful people on the IRC channel.

Saturday, March 7, 2020, 1:34:04PM

Been reminded recently of the importance of learning a command-based editor like ed (after recently talking about Rob Pike and his famous “ed is the only editor you need” statement). It is often the only editor you can use when you don’t have visual mode after hacking into a system through a service that does not support a full TTY. I remember doing that with Metasploitable some years ago for our Summer camp and being like, “Hummm, so there actually is very real value in learning ed.” Now I’m trying to determine if ex (the successor to ed) is just as good. I notice that its man page is now combined with the man page for vi so not sure.

I want to create a simulation of a situation where ed is all you get and how to do that. I think something as simple as a basic script tied to nc should do the trick, sort of a local, hackable service just to simulate lack of a full terminal environment.
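A rough sketch of the idea (netcat flags vary a lot between implementations, so this assumes a traditional nc with -e; the OpenBSD one dropped that flag and needs a fifo trick instead):

```shell
# Terminal 1: serve a bare shell on a local port. No PTY, no job
# control, no visual editors -- so ed is what you get.
#
#   nc -l -p 9999 -e /bin/sh
#
# Terminal 2: "hack" in and try editing a file with ed only.
#
#   nc localhost 9999
```

A small wrapper script around that server side would make it repeatable for camp exercises.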

In fact, a friend from the Twitch stream shared the fun fact that Joy gave up on vi because there were so many different versions and just used ed instead. vi didn’t even start in visual mode by default; Joy only made that the default a couple months later after noticing that nobody was using it as a line editor. Apparently there’s a whole video and another on the history of it all.

(God I love having a stream community.)

Saturday, March 7, 2020, 1:29:18PM

Out of curiosity I actually looked up how sed suddenly had in-place editing. Turns out it has been in the GNU version the whole time, which is why I always used perl -p -i -e instead because I knew it would be on every UNIX system, not just GNU stuff. But given the widespread use of Linux now I suppose using sed -i is good to know.

It’s not unlike things like mkdir -p /some/long/path which is unsupported in any of the original POSIX mkdir versions (another thing GNU added). I use it all the time now, but knowing when and how stuff changed is something of an obsession.
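A quick side-by-side of the in-place idioms (the file path is just for the demo):

```shell
# In-place editing, GNU-style and the portable old standby.
printf 'hello world\n' > /tmp/sedi-demo.txt

# GNU sed (BSD sed wants -i with an explicit backup-suffix argument)
sed -i 's/world/there/' /tmp/sedi-demo.txt

# The equivalent that runs on any UNIX with perl installed:
# perl -p -i -e 's/world/there/' /tmp/sedi-demo.txt

cat /tmp/sedi-demo.txt   # prints "hello there"
```

The pure-POSIX fallback is still writing to a temp file and mv-ing it back over the original.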

Saturday, March 7, 2020, 12:41:19PM

Gabe brought in his new Razer Blade 15" 2019 and is reporting on it. I’m completely convinced it’s the go-to system for most pentesting and forensics (not to mention best gaming laptop). He had problems initially with suspend on lid close (because the default Mint kernel on the 19.3 ISO was too old); upgrading the kernel fixed it.

The chassis is the most solid I’ve encountered and I love the keyboard, sturdy and chiclet-y. I destroy keyboards so that is a bare requirement. Definitely my next purchase when ready to go mobile again for everything. My (bork) XPS will have to do for now.

Gabe says make sure to get it from Amazon since price and warranty is better. He bought the four year coverage including accidents.

Saturday, March 7, 2020, 12:26:40PM

GitLab sucks when viewed with Lynx. Need to open a ticket.

Saturday, March 7, 2020, 12:23:12PM

TIL that screen now has splitting panes. The question came from a long-time screen user about why he should consider TMUX. A bunch of things came to mind, but these are best summarized here. The author left off the customization of vim bindings for navigation between panes and resizing them, which I could not live without.

I did find it extremely validating that the screen default bindings for splitting are the same exact keys I have set in my tmux.conf file (rather than the bork defaults).

Saturday, March 7, 2020, 11:13:57AM

Back on standard Xterm colors. There is something about it that is beyond “retro” feel. The colors are very solid. The contrasts are amazing on any device at any size and dimensions. And people see what they will likely see when they pull up a terminal for the first time on Raspberry Pi or whatever. Plus asciiquarium looks as expected.

Saturday, March 7, 2020, 10:28:07AM

Last night someone showed us a video about a seriously mentally ill genius who created their own compiler and operating system and everything that goes with it including some really crazy graphics. At first it was intriguing but by the end I was sad and pissed for having seen it. I know the person mentioning it wanted to explore the possibility of making an OS and was pointing out the extreme intelligence involved, but the video seemed to linger and dwell on this person’s descent into madness for no other purpose than to sensationalize the drama. I suppose capturing the story is one thing, but showing all the videos of his worst moments crossed the line. I didn’t need that. It triggered my disgust with people who laugh at or just overly sensationalize any mental illness. I think I might have hurt some feelings without meaning to but it’s not okay. Mr. Robot deals with that topic so perfectly.

Saturday, March 7, 2020, 1:04:02AM

So today’s OTW had a lot of SSL port stuff in it that was nice to see because the last time I did anything with port connections and enumeration was in the 90s (other than for fun here). The openssl s_client subcommand is absolutely amazing. It takes all the hassle out of the SSL/TLS complexity, allowing connections to SSL-enabled ports without having to negotiate all the cert shit. Here’s one to remember:

openssl s_client -ign_eof -connect <host>:<port>

It is the same as plain old nc of old. It waits around to see what you type in and otherwise passes through whatever you paste in. The -ign_eof holds the connection open for interactive sessions. Looks like you need this and cannot just pipe to stdin as you can with nc, but I’ll research that more later.
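For example, speaking HTTP by hand over TLS the way you would with nc on port 80 (example.com is just a stand-in host; -quiet implies -ign_eof and also suppresses the cert chatter):

```shell
# Pipe a raw HTTP request through a TLS connection. Without
# -ign_eof (implied by -quiet) s_client would bail when stdin
# hits EOF before the server responds.
printf 'GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n' |
  openssl s_client -quiet -connect example.com:443
```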

Friday, March 6, 2020, 10:13:32PM

Turns out the sound corruption is from OBS Studio. The fix is to exit OBS completely and the app using sound (usually the web browser) and then restart the app and restart OBS.

Friday, March 6, 2020, 7:56:55PM

With all the focus on pentesting it has really warped my sense of software development. For example, learning everything there is to learn about CSS is simply not needed. Creating the G0CT has been one of the best things for SkilStak because it has really forced me to focus on only those things that have very broad application to all foundational technology careers. For example, there is no need to learn Vue or React, only vanilla JavaScript with a solid understanding of the DOM and a focus on HTTP (along with other networking).

This is coming up now because even though I stand by the recommendation to get Jennifer’s Learning Web Design book most of it is irrelevant to the G0CT. It has such solid information in it that everyone should eventually work through the whole thing. But I plan on getting very specific about the sections and projects that are required just to get a good initial understanding leading to the certificate evaluated project of creating a personal/professional web site from scratch with the minimal requirements:

The title of the sub-cert, Knowledge Content Creation, also conveys the goal of it better. Sure understanding cross-site scripting vulnerabilities will require a lot more JavaScript understanding later, but not to get started.

In short the G0CT should equally provide a foundation to all of the following fast-growing career paths:

As I go though the vetting process for all the content I already have and decide what stays and what goes or gets put into another category or cert I’ll be applying this check list to make sure whatever is there applies to everything.

Earlier I was struggling a little because I had “Web Developer” in the list, which does not require Linux and command line skills beyond basic navigation. But a Full-Stack PWA Developer needs much, much more — including a full understanding of Bash scripting, HTTP, and the use of curl.

Wednesday, March 4, 2020, 4:46:42PM

Rather conflicted at the moment about this interpreted language to pick back up again. All of the following are supported by weechat and are prevalent in the Metasploit database:

They are all mandatory languages to learn at some point. The question is which to choose when all are available.

I did find it interesting that there is no Node in any of them.

I think, as I concluded earlier, that I have to get over my “bad breakup” (Brian Cantrill’s term) with Python and just accept that it is a required secondary language to maintain. Here’s all the justification for that:

Have to look at nim as well. But as a general purpose language Python is mandatory learning for the G0CT certification; there is no doubt of that (as much as I might hate that to be true).

Wednesday, March 4, 2020, 12:57:52PM

Discovered today that the pretty graph that shows the volumes of a sound stream is called a VU meter. Turns out there is a really good one for terminal, which means, of course, that I had to get it. God I love what is possible with full color terminal support all over the place now.

CLI Visualizer

Tuesday, March 3, 2020, 8:55:32PM

For the first time I actually miss Arch Linux. It definitely has more up-to-date packages and shit I want to use (weechat) is waaaay behind on the standard Mint distro, which is to be expected but still annoying.

This means I have to make a decision, leave Mint and go down the Arch rabbit hole just for my own stuff while maintaining a healthy relationship with Mint so that I can help beginners get started on it, or keep Mint and suffer through extremely old stuff.

There is no doubt that the core Debian team is on top of things, but the Arch team is where all the edge energy is from the Linux development community. That’s a given.

I’m torn because I want to join that community (again) and to be there you really just have to have the distro they have decided to use, which seems to be Arch at the moment.

I’m inclined to put Arch on my main system, but doing that would mean that anyone on the stream who is just beginning would maybe be misled to go with Arch before they are ready for it.

I had considered just using a VM or streaming a Kali system next to me (which I’ll have to do anyway for OSCP prep). It is Debian after all. But that has all the stuff I need on it already and I would not be showing how to do installs much. I can give this some time and see how much Kali I’ll actually be using all the time. I could make Arch my main distro but usually have it set to Kali for most work for the OSCP prep stuff and even treat Kali like more than just Kali.

I wonder how well a Kali Linux system would stream. If it streamed well I could set that as the main system since almost all my work will be using tools that are on it and showing what those tools are. It makes sense to consider how easy it would be to set up a stream from it. It’s likely that the problem with outdated packages and not being on the edge of Linux development will also affect Kali, possibly more in fact.

I have some time to think this through. I could use this capture card and a projector to have more than one computer associated with a single screen, essentially, and could switch back and forth between them just with OBS.

Tuesday, March 3, 2020, 6:14:41PM

WeeChat has definitely sent me down a great rabbit hole. In fact, I no longer find even the slightest interest in participating in any chat that is not IRC. I completely understand why the “1337” hackers only use IRC; it is so ridiculously more productive than any graphical chat application, and that much is clear after only about an hour playing with weechat. I am so glad Twitch has built everything on it. I could not be more pleased with their decision.

Time to go find some more Freenode channels.

Tuesday, March 3, 2020, 5:23:29PM

I absolutely love that weechat is entirely written in C. I’ve decided that is the first thing I’m going to learn in IRC land. It supports Python and Perl plugins, and frankly to become a seriously good pentester everyone should learn Python, Perl, Ruby, and TCL as well as Bash, even if Bash can do everything all the others can. People need to at least be able to read the others since so much in the Metasploit database is written in one of those languages, as are plugins for different hacker-y things like weechat, which is really the new BitchX.

Tuesday, March 3, 2020, 5:09:56PM

Python is officially back in the house. Python most perfectly fits that place of making significant software that is not yet worthy of a full port to a statically compiled language. It is the best for data analysis, and its dominance of systems automation with Fabric means that it will retain its position as a systems glue language despite Go’s supremacy. It is a very small niche, to be sure, but enough to justify learning it for any G0CT.

The dilemma I have is whether to bring Perl back in addition to Python. It is still a very powerful language, far more powerful than Python specifically for pentesting. Things can be done so much more quickly in Perl.

Monday, March 2, 2020, 8:39:45PM

After hacking my way into https://www.hackthebox.com (don’t forget the www) and watching the earlier OSCP guy’s video it has become clear that there is another mid level between G0CT and OSCP that involves everything needed to even create an account on hackthebox. It requires significant JavaScript knowledge as well as HTML and HTTP and base64 encoding. Those are not beginner things in the slightest.

Monday, March 2, 2020, 5:34:51PM

This guy says that his “report” was 240 pages of markdown and screenshots converted to PDF. So yeah, knowledge source management is a critical dependency skill in all of this. Love that this validates including it in G0CT as well as everything with the README World Exchange.

Good advice on the “memory buffer overflow” machine:

“Setup your payload and shell code so you can go back to it, not you … That way there’s no need for them to regenerate shell code. There’s no need for them to create their rhosts and rport. Make it one-way so you can callback to it, not to you.”

The callback is the thing running that grants access. He’s saying to put all that code on the remote machine so that the overflow code calls that, rather than something that initiates a call for that code from your local system. That’s pretty obvious stuff because that is exactly what a backdoor is: opening something on the remote target that you can log in to.

Reading that the term “enumeration”, which has a rather specific meaning in the software world, just means capturing your discovery results as the initial phase.

::: Enumeration is used to gather the below:

* Usernames, group names
* Hostnames
* Network shares and services
* IP tables and routing tables
* Service settings and audit configurations
* Applications and banners
* SNMP and DNS details

Enumeration can be performed on the below:

1. NetBIOS enumeration
2. SNMP enumeration
3. LDAP enumeration
4. NTP enumeration
5. SMTP enumeration
6. DNS enumeration
7. Windows enumeration
8. UNIX/Linux enumeration
:::

Enumeration is more than nmapping, it is “enumerating” all the information available from a given port. So SNMP has a lot of information about the system (snmpwalk).

The fact that “enumeration” is considered a basic foundational skill for OSCP implies there is a lot of base learning required far beyond what I was planning for G0CT. I happen to know a lot of it, but someone who hasn’t been a system admin or done any patch auditing is going to really feel left in the dark on that stuff. Something is needed like G0CP -> ???? -> OSCP, where the middle cert would contain all the material from LPIC-0, KLCP, LPIC-1, Network+, A+, and Security+, but just not suck so badly.

Apparently this guy “took a lot of breaks” which is going to be really important for me. At least the test is from the safest place you can find and not their testing center. Instead of “try harder” he suggests:

“Take a break and try your process again.”

I really love that. Trying harder means you’ll just produce cortisol and it will block you from seeing what will be obvious to someone else, or to yourself after you’ve taken a good enough break.

I love that he didn’t use Metasploit at all. I fucking hate it. It’s a script-kiddy thing.

Also he says don’t waste time on kernel exploits even though the kernel versions are old because they are not intended to be the main vectors.

He finished in eight hours but he sounds like he was really prepared.

Monday, March 2, 2020, 5:21:23PM

So the first thing I learned about the OSCP process is that it takes up to a month just to get your packet and lab access after your application, and another full month before your request to take the actual test goes through. That means there are at least two months of lag time for those who might not even need that long to do everything (which I definitely will).

This means that there is no way I could reasonably hit the OSCP by DefCon; instead DefCon will be an amazing way to tap the minds of a bunch of people who have received theirs and potentially record them to share with others. If anything DefCon is sort of a mandatory expense (to me) to cover as much as I can during the process and document it for everyone.

I also need to spend most of my free video watching time watching all the others who have received their OSCP and start cataloging them.

Monday, March 2, 2020, 5:11:36PM

Can I just say how fucking happy I am to be free of concern for teaching and keeping up on web shit. The only thing I ever want to learn about it is how to hack it. It has created such a level of stress for so many years. It is a good skill and creating a JAMstack site is required learning, but all the rest? Fuck no. I’m done. It’s so completely liberating. Mostly I’m just happy to be rid of the entire web development — specifically JavaScript community — which is filled with horrendously horrible human beings.

Monday, March 2, 2020, 4:51:42PM

Perhaps the coolest thing about the OSCP is that it fundamentally forces you to build your continuous learning skills and creativity. This is exactly what terrifies people because they haven’t been taught to do research and learn on their own. They’ve had their creativity beaten out of them by the school system. The OSCP demands high technical skills, but more than anything it requires solid research and self-evaluation skills.

Monday, March 2, 2020, 4:32:28PM

Found a great OSCP resource that I think does a good job covering what people should expect. It was interesting that Assembly and memory buffer overflows are a part of the OSCP. That definitely puts the OSCP in the category of not-for-noobs. In fact, unless you have done some C coding I have a hard time recommending it to anyone.

I found the resource while looking for the required languages for the exam. Obviously having a really solid handle on shell scripting is required (which I have), but specifically I’m wondering if they have a Python bias anywhere in their requirements.

One thing mentioned is the amount of research required to complete the 55 labs of different difficulties. It made me smile that I can research faster than 99.9% of the rest of the population with Lynx and the techniques I’ve developed over the last 20 years. Hacking is really about knowledge more than skill. It confirms my selection of content for the G0CT I’m creating. Reading the overview of OSCP prep is also good to make sure I have all the building blocks leading up to any tech career, but specifically for someone headed to the OSCP. Linux, for example, is not fundamentally required to become a web developer. But it is for pentesting for sure.

Monday, March 2, 2020, 4:15:09PM

I’m so glad that my subscribers went down by two. I know that is a weird thing to say, but the reason is because it means that people are self-filtering, which is great. Here are the people who leave:

There. That’s all I needed to get off my chest. Now on to stuff that matters.

Monday, March 2, 2020, 11:18:03AM

Did about an hour of research on SourceHut.org and ultimately concluded the following:

Source Hut Languages