# Useful homebrew formulae

Last week I was telling Paul about the things I have installed with brew on my work laptop. I pulled a full list this morning while doing some upgrades.

Glancing over it a bit, here are some of my favorite and most useful ones:

• autojump for a usage-adjusting cd -- given a partial string it just goes to the most-used directory with that prefix. I have autojump aliased to j to shorten it up even more.
• colordiff to make diffs/grep/etc nicer to look at in the terminal.
• coreutils/dateutils are nice because they install the GNU versions of a lot of standard Linux commands like du, ls, and date, which behave a bit differently on OSX.
• jq is amazing — lets you query JSON objects.
• ledger to manage bank balances in plain text.
• newsbeuter to subscribe to RSS feeds.
• osquery lets you do SQL-like queries to find out about the system it’s running on (sharvil did a bunch of work on it, actually…)
• terminal-notifier to pop up notifications from terminal/cron jobs.
• tmux to keep long-running terminal sessions alive without keeping terminal windows open.
• watch instead of "some command"+uparrow+enter+uparrow+enter+....
• wget as shorthand for “curl http://xyz/tuv.json > tuv.json” (old habits…).
• youtube-dl for downloading youtube videos.

# KLR-650 first month & american gumball

Yesterday I finally got the title for my motorcycle and realized that I'd owned it for exactly one month!

So far, I haven't done anything very adventurous -- still getting used to riding again. I had a couple of things that were making me nervous about it:

1. The bike gets warm (more than halfway on the heat gauge, but still significantly below the red) just tooling around, usually on my commute home in the evening.

2. There is a sputtering, almost backfiring sound when I am coasting down hills, which happens several times in each direction.

Turns out that both are normal! Hooray!

I was thinking about other things I want out of life and ended up coming across the American Gumball Rally website. In high school, Beals and I used to watch "documentaries" about people on the Gumball 3000 and Mischief runs, their expensive cars, and their (pretty crazy) disregard for speed limits. I pitched Beals on doing the Nevada America Gumball Run in 2017, and we immediately got into whether his Focus or my Focus got to wear a racing stripe and do the honors, when Beals dropped a bomb on me:

> too bad you don't have your f250 still, that would be an epic ride

It turns out my parents do still have the '91 Ford F250 to which Beals was referring. There's one catch: the truck is in Prosser, WA, and I'm in San Francisco, CA, and Beals is in Portland, OR. Okay, so, we'll take on a road trip:

Correction: two catches. The second is that the truck's transmission and alternator are dodgy. And per the first catch, we have 14+ hours through Oregon and Nevada. We'll just have to bring some way to get ourselves out of trouble if it breaks down somewhere. Something that could fit in a pickup. Something like…

So yeah -- that's the plan:

1. Drive the bike to Prosser.
2. Load the bike on the truck.
3. Drive to Las Vegas with the truck and the bike.
4. Do the rally.
5. Drive the truck back to Prosser.
6. Ride the bike back to San Francisco.

Boom -- problem solving.

# stocks and options from 30k feet

One of my friends at work asked me if I had any book recommendations for learning about stocks and options. Mentally, I break trading down into two general classes: index-type and "exotic" trading. By exotic trading, I mean picking individual stocks/options and actively trading. This runs counter to the more conservative buy-and-hold, index-based, hands-off approach.

For the exotic trading, I learned most of what I know from a class with Professor W.E. Olmstead and his book, Options for the Beginner and Beyond: Unlock the Opportunities and Minimize the Risks. For the option-uninitiated, the basic idea is that instead of buying or selling stocks directly, you buy and sell contracts that give you the right (but not obligation) to buy or sell the stock at a particular price by a particular date. That's a mouthful and options are indeed subtle beasts, but they allow the flexibility to either hedge risks you want less exposure to, or increase/leverage exposure to risks you do want to take.

Here's one pretty basic example, called the "Covered Call" strategy. Suppose you:

• buy 100 shares of SQ (currently trading at \$11.12 per share, or \$1112 total)
• sell 1 DEC16-16 \$13 SQ call contract (currently trading at \$0.25 per option share, or \$25 per contract)

You get the \$25 account credit when you sell the call contract; your eventual profit vs loss on this trade breaks down as follows:

• If the stock price is above \$13, the option is likely to be exercised, which means you will be paid \$13 per share for each share of the contract (\$1300 overall) and the counterparty takes ownership of the shares.
• If the stock price is below \$13, the option is likely to stay unexercised, so you keep the 100 shares of the stock and the original \$25.

This sale can be repeated as often as the option expirations are available, which is usually every month. Exceptions happen in both directions, though: some (especially newer) stocks have less frequent option expirations, while some (extremely high-volume) stocks have more frequent option expirations -- as often as weekly! Furthermore, in the unexercised case, you don't need to buy the stock again since you still have it from the last round. With enough iterations of this, one can even pay off the original stock purchase purely from writing the covered calls.
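The payoff arithmetic above can be sketched in a few lines of Python. This is a minimal illustration using the SQ numbers from the example; the function name and the two scenario prices are just for demonstration:

```python
def covered_call_pl(stock_cost, strike, premium, shares, final_price):
    """Profit/loss at expiration for one round of a covered call."""
    if final_price >= strike:
        # exercised: the shares are called away at the strike price
        return shares * (strike - stock_cost) + premium
    # unexercised: keep the shares; count the (unrealized) stock move plus the premium
    return shares * (final_price - stock_cost) + premium

# SQ example: 100 shares bought at $11.12, one $13 call sold for $25 total
print(round(covered_call_pl(11.12, 13.00, 25.00, 100, 14.00), 2))  # exercised: $188 stock gain + $25 premium
print(round(covered_call_pl(11.12, 13.00, 25.00, 100, 10.00), 2))  # unexercised: $112 paper loss offset by $25
```

Note that in the unexercised branch the stock loss is unrealized -- you still hold the shares -- which is exactly why the strategy can be repeated each expiration cycle.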

That's not to say this is easy, though! If the market considered it impossible for the stock to go that high, nobody would buy the option contract. So the trick to this strategy is finding a stock, predicting its movement, and actually being correct.

You can use Google Finance to view the stock price history and the stock option price chain:

These quotes, however, can be delayed; most reputable option trading companies will provide an up-to-the-second view of the option chains.

Turns out that finding, predicting, and being correct are pretty hard, and a fair bit of a distraction. So instead, I mostly opt for the boring route: investing in index funds and riding the standard stock-market growth curve. As best as I can tell, this is pretty close to the view espoused by Bogleheads.org (though I admittedly haven't read the book): invest a steady stream of earnings in an index fund with as low a management fee as possible, and let it ride as long as possible. Index funds are a kind of "synthetic" stock that tracks a basket of other stocks. For example, QQQ tracks the performance of the Nasdaq-100 index, DIA tracks the performance of the Dow Jones Industrial Average, and SPY tracks the performance of the S&P 500.

Disclaimers: The SQ stock quotes listed above are just illustrative examples and not trading recommendations, though I am long SQ as of the post date of this article.

# streaks vs statistical streaks

Hacker News et al are obsessed with streaks, but I think they have some problems:

1. A single regression resets to zero.

2. There's not an easy way to gradually ramp up your streak-commitment over time.

I prefer a different approach: statistical streaks.

Suppose I made a commitment to do something differently on 2016-08-26, and did it for the next 5 days; then my 30-day statistical streak avg = 0.1667, but my 5-day statistical streak avg = 1.0.

If I then have one off-day (5 consecutive days, then one failure day), the 30-day statistical streak avg is still 0.1667, but the 5-day statistical streak avg drops to 0.80.

If I pick it back up the next day, it goes to 0.2 and 0.80 for the 30- and 5-day statistical streak averages (respectively), while if I fail again, it drops to 0.1667 and 0.6 for the 30- and 5- variants, respectively.

Here's a tiny bit of python to demonstrate these scenarios:

    assumed_history = 0
    explicit_recent_history = [1, 1, 1, 1, 1, 0, 0]

    for window_days in [30, 5]:
        # lazy: pad the front with assumed history so the slice always has window_days entries
        history = [assumed_history] * window_days + explicit_recent_history
        recent_history = history[-window_days:]
        success_days = sum(recent_history)
        print(window_days, success_days / window_days, recent_history)


# i fucking love mangoes

Specifically, the Philippine Brand Dried Mangoes. They are amazingly good.

But there is a slight problem. Costco sells them in 30 oz bulk packs, which is 20 servings at 160 calories/serving, or 3200 calories total. Left to my own devices (and lack of self-discipline), I could probably eat, and subsequently regret eating, the entire bag of mangoes.

I need a self-check to slow myself down and impose a bit more of a speedbump than an already-opened bag provides. So I fired up the FoodSaver and made individual serving-sized bags:

Each was individually weighed to be one serving:

I ended up making 20:

# switching over to https

One of the things I've been meaning to do forever is switch things over to https. By "things", I mean the set of websites I run for some family and friends. I tried it out with my personal website first, then flipped over the rest.

## implementation notes

1. I used the letsencrypt start guide to generate the certificates.
2. Modified the nginx config to:
   a. serve ssl/https traffic on port 443 for the given domain with the proper https certificates/etc.
   b. forward non-ssl/http traffic on port 80 to port 443 for the given domain.
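Those two nginx changes map to two server blocks, roughly like this. This is a minimal sketch, not my exact config: `example.com` stands in for each real domain, and the certificate paths assume letsencrypt's default layout:

```nginx
# port 80: redirect all plain-http traffic to https
server {
    listen 80;
    server_name example.com;
    return 301 https://$host$request_uri;
}

# port 443: serve https with the letsencrypt certificates
server {
    listen 443 ssl;
    server_name example.com;
    ssl_certificate /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;
    # ...the rest of the per-site config (root, locations, etc.)...
}
```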

## verification

It turns out that the nginx configuration files are a little bit error-prone. This probably means that I am doing something wrong, like not using a configuration management tool like puppet or ansible or whatever. But for something as small scale as my site, it doesn't really meet the cost-benefit threshold for learning a new tool/language. I even considered spinning up a simple one-off configuration generator, but then I'd need to figure out how to override and extend it as needed.

My first step was to write a simple requests call that can get the response given a scheme and hostname:

    import requests

    def check_scheme_host(scheme, hostname):
        url = "{scheme}://{hostname}/".format(
            hostname=hostname,
            scheme=scheme,
        )
        try:
            resp = requests.get(url, allow_redirects=False)
        except requests.RequestException:
            return None
        return resp


Note that our check disallows redirects, because otherwise requests would just helpfully follow the 301s. Since the 301s are part of what we're trying to verify, we don't want to follow them.

Given that function, we can define our expectations for a given check. In particular, we can define a function that expects a certain response code for a given hostname and scheme:

    def expect_check(expected_response, hostname, scheme):
        actual_response = check_scheme_host(scheme, hostname)
        if actual_response is not None and actual_response.status_code == expected_response:
            return hostname + " passed " + scheme + " check!"
        else:
            return hostname + " failed " + scheme + " check!"


Finally, we can check a given hostname:

    def check_host(hostname):
        print(expect_check(301, hostname, "http"))
        print(expect_check(200, hostname, "https"))


This would let us make calls like

    check_host("traviscj.com")


which would print out something like

    traviscj.com passed http check!
    traviscj.com passed https check!


With that in place, we need some definition of which hosts we want to check. Obviously, we could just define it in the python source, but it'd be nice to make it a bit more reusable. So let's define a simple JSON file that includes the hostnames that I want to ensure work correctly. Here's a sample of a few of them:

    [
        {"host": "bovineherdhealth.com"},
        {"host": "priceofth.at"},
        {"host": "traviscj.com"}
    ]


and a tiny bit of file loading logic:

    import json

    def from_file(hosting_check_filename):
        with open(hosting_check_filename) as hosting_check_file:
            hosting_checks = json.load(hosting_check_file)
        for entry in hosting_checks:
            check_host(entry['host'])


and finally, a tiny final bit of wiring:

    def main():
        from_file('/Users/traviscj/hosting-check/hosting_check.json')

    if __name__ == "__main__":
        main()


And now we have a way to test all the host forwarders.

# inspired by magicmirror

I've been really inspired by the MagicMirror project. The basic idea is getting a piece of mirror glass and putting a monitor and computer behind it, then having a status page show some pertinent information about the day (like weather, calendar, news, etc). So it looks like a regular mirror, but when you look closely, it shows the extra information.

I'd like to put one in the bedroom to replace our tall mirror. That'll be pretty cool, but it requires a bit of extra thought because our magicmirror setup would replace a tall and skinny mirror, so we'll probably just want a monitor behind the top part of the mirror. That will require some extra bracing to hold the monitor up in the frame. I'll need to think about that a while.

To hold me over in the meantime, and to get a platform to play with the software, I ordered a Raspberry Pi 2 and a small screen. With that, I'll be able to play with it on a small scale first. I was thinking about mounting it in between two cabinets in the bathroom, but just realized that might not work well with the steam from showers and so forth.

One thing I'd really like to see out of it would be bank & card balances. I guess I'm a little obsessive about that stuff, but it really would be nice to have it. This desire sent me down a big rabbit hole of trying to get OFX transactions downloaded. Of course, it seems nearly impossible. I was about to give up, but then stumbled across the fantastic mintapi python package.

# individual sealed ibuprofen & zyrtec

Today I made some individually sealed ibuprofen (Advil) and cetirizine (Zyrtec) packets:

I did it by cutting about a two-inch strip of vacuum sealing bag, then doing several perpendicular seals far enough apart that I could pop a couple of pills into each pocket.

Why bother? Mostly, I want to keep a few doses in my backpack / car, but don't want them turning into dust. (The softer coat on Kirkland seems especially problematic.) The next best idea is of course just buying the commercial version, but I don't really need 50 packets at a time. I'd prefer to keep more stock in the "more liquid" bottle form. Finally, waterproof can't hurt!

I meant to do double seals, so it'd be easy to separate the cells. I remembered on the first seal, but then I forgot to do this on all the others. So I had to very carefully slice down the middle of the seal. This probably wouldn't be sufficient for food storage, but it seems like it'll be fine for storing dry pills.

I cut small notches on each packet so I might have a chance at tearing them open without tools. I cut the bag almost exactly the same width, but the ibuprofen bags wouldn't really vacuum because I put two pills per slot. I had to settle for just sealing them.

It took about 10-12 minutes to do a row of 10 pills the second time I did it. I could probably get faster if I did it a bit more often. The cost calculations reflect consumables only, and in particular don't include the cost of the vacuum sealer and other tools.

# first canning adventure: brandied pears

I've been wanting to get into canning for a while. It was finally time to take the plunge. I made brandied pears. One of the really striking things is how simple the ingredients list is:

• pears
• lemon juice
• sugar
• water
• brandy

Here's the setup just before I started:

Allison got me a pressure canner/cooker for Christmas:

It's huge! After I cooked the pears I packed them

and mixed the brandy into the syrup

and eventually got them into the cans, sealed and processed:

They are delicious! I think I'm going to try something next that requires the pressure canner's features, so I've got a few things on my list:

# pragmatic edc

Lots of folks wax poetic about their everyday carry kit, and then include things like fishing hooks. That might make a "pragmatic hiking edc" kit, but I'm interested in some of the more realistic things that might happen:

1. Phone is dead and need to call someone (wife, parent, car insurance company, whatever) but don't have the relevant phone number / insurance number / etc.
2. Minor illness like headache, indigestion, things like that.
3. Forgot my wallet at home and need lunch or cab fare.
4.