# stocks and options from 30k feet

One of my friends at work asked me if I had any book recommendations for learning about stocks and options. Mentally, I break trading down into two general classes: index-type and "exotic" trading. By exotic trading, I mean picking individual stocks/options and actively trading; this runs counter to the more conservative buy-and-hold, index-based, hands-off approach.

For the exotic trading, I learned most of what I know from a class with Professor W.E. Olmstead and his book, *Options for the Beginner and Beyond: Unlock the Opportunities and Minimize the Risks*. For the option-uninitiated, the basic idea is that instead of buying or selling stocks directly, you buy and sell contracts that give you the right (but not the obligation) to buy or sell the stock at a particular price by a particular date. That's a mouthful, and options are indeed subtle beasts, but they allow the flexibility to either hedge risks you want less exposure to, or increase/leverage exposure to risks you do want to take.
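The "right but not obligation" mechanics are easiest to see in code. Here's a minimal sketch of the profit on a long call at expiration; the function name and numbers are my own hypothetical illustration, not from the book:

```python
def long_call_profit(stock_price, strike, premium, shares=100):
    """Profit at expiration for buying one call contract.

    The right to buy at `strike` is worth max(stock_price - strike, 0)
    per share at expiration; subtract the premium paid up front.
    """
    intrinsic = max(stock_price - strike, 0.0)
    return (intrinsic - premium) * shares

# Hypothetical numbers: pay $0.25/share for a $13 call (one 100-share contract).
print(long_call_profit(stock_price=11.00, strike=13.0, premium=0.25))  # -25.0: expires worthless, lose the premium
print(long_call_profit(stock_price=14.00, strike=13.0, premium=0.25))  # 75.0: $1/share of intrinsic value, minus premium
```

The buyer can lose at most the premium but keeps the upside above the strike; the seller's exposure is the mirror image, which is why selling calls against stock you already own (as in the next example) caps your risk.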

Here's one pretty basic example, called the "Covered Call" strategy. Suppose you:

• buy 100 shares of SQ (currently trading at \$11.12 per share, or \$1112 total)
• sell 1 DEC16-16 \$13 SQ call contract (currently trading at \$0.25 per option share, or \$25 per contract)

You get the \$25 account credit when you sell the call contract; your eventual profit vs loss on this trade breaks down as follows:

• If the stock price is above \$13, the option is likely to be exercised, which means you will be paid \$13 per share for each share of the contract (\$1300 overall) and the counterparty takes ownership of the shares.
• If the stock price is below \$13, the option is likely to stay unexercised, so you keep the 100 shares of the stock and the original \$25.

This sale can be repeated as often as option expirations are available, which is usually every month. Exceptions happen in both directions, though: some (especially newer) stocks have less frequent option expirations, while some (extremely high-volume) stocks have more frequent expirations -- as often as weekly! Furthermore, in the unexercised case, you don't need to buy the stock again, since you still have it from the last round. With enough iterations of this, one can even pay off the original stock purchase purely from writing the covered calls.

That's not to say this is easy, though! If the market considered it impossible for the stock to go that high, nobody would buy the option contract. So the trick to this strategy is finding a stock, predicting its movement, and actually being correct. As discussed above, you can use Google Finance to view the stock price history and the stock option price chain. These quotes can be delayed, however; most reputable option trading companies will provide an up-to-the-second view of the option chains.

It turns out that finding, predicting, and being correct are pretty hard, and a fair bit of distraction. So instead, I mostly opt for the boring route: investing in index funds and riding the standard stock-market growth curve. As best as I can tell, this is pretty close to the view espoused by Bogleheads.org (though I admittedly haven't read the book): invest a steady stream of earnings in an index fund with as low a management fee as possible, and let it ride as long as possible. Index funds are a kind of "synthetic" stock that tracks multiple other stocks.
For example, QQQ tracks the performance of the Nasdaq-100 index, DIA tracks the performance of the Dow Jones Industrial Average, and SPY tracks the performance of the S&P 500.

Disclaimers: The SQ stock quotes listed above are just illustrative examples and not trading recommendations, though I am long SQ as of the post date of this article.

# streaks vs statistical streaks

Hacker News et al are obsessed with streaks, but I think they have some problems:

1. A single regression resets you to zero.
2. There's no easy way to gradually ramp up your streak commitment over time.

I prefer a different approach: statistical streaks. Suppose I made a commitment to do something differently on 2016-08-26 and did it for the next 5 days; then my 30-day statistical streak avg = 0.1667, but my 5-day statistical streak avg = 1.0. If I then have one off-day (5 consecutive success days, then one failure day), the 30-day avg is still 0.1667, but the 5-day avg drops to 0.8. If I pick it back up the next day, the averages go to 0.2 and 0.8 for the 30- and 5-day windows (respectively), while if I fail again, they drop to 0.1667 and 0.6.

Here's a tiny bit of python to demonstrate these scenarios:

```python
assumed_history = 0
explicit_recent_history = [1, 1, 1, 1, 1, 0, 0]
for window_days in [30, 5]:
    # lazy: left-pad with assumed history so the window is always full
    history = [assumed_history] * window_days + explicit_recent_history
    recent_history = history[-window_days:]
    success_days = sum(recent_history)
    print(window_days, float(success_days) / window_days, recent_history)
```

# i fucking love mangoes

Specifically, the Philippine Brand Dried Mangoes. They are amazingly good. But there is a slight problem: Costco sells them in 30 oz bulk packs, which is 20 servings at 160 calories/serving, or 3200 calories total. Left to my own devices and lack of self-discipline, I could probably eat (and subsequently regret eating) the entire bag of mangoes.
I need a self-check to disgust myself and impose a bit more of a speedbump than an already-opened bag provides. So I fired up the FoodSaver and made individual serving-sized bags:

Each was individually weighed to be one serving:

I ended up making 20:

# switching over to https

One of the things I've been meaning to do forever is switch things over to https. By "things", I mean the set of websites I run for some family and friends. I tried it out with my personal website first, then flipped over the rest.

## implementation notes

1. I used the letsencrypt getting-started guide to generate the certificates.
2. Modified the nginx config to:
    a. serve ssl/https traffic on port 443 for the given domain with the proper https certificates/etc.
    b. forward non-ssl/http traffic on port 80 to port 443 for the given domain.

## verification

It turns out that the nginx configuration files are a little bit error-prone. This probably means I am doing something wrong, like not using a configuration management tool like puppet or ansible. But for something as small-scale as my site, it doesn't really meet the cost-benefit threshold for learning a new tool/language. I even considered spinning up a simple one-off configuration generator, but I'd need to figure out how to override and extend it as needed.

My first step was to write a simple requests call that gets the response for a given scheme and hostname:

```python
import requests

def check_scheme_host(scheme, hostname):
    url = "{scheme}://{hostname}/".format(
        hostname=hostname,
        scheme=scheme,
    )
    try:
        resp = requests.get(url, allow_redirects=False)
    except Exception:
        return None
    return resp
```

Note that our check disallows redirects, because otherwise requests would just helpfully follow the 301s. Since the 301s are part of what we're trying to verify, we don't want to follow them.

Given that function, we can define our expectations for a given check.
In particular, we can define a function that expects a certain response code for a given hostname and scheme:

```python
def expect_check(expected_response, hostname, scheme):
    actual_response = check_scheme_host(scheme, hostname)
    if actual_response is not None and actual_response.status_code == expected_response:
        return hostname + " passed " + scheme + " check!"
    else:
        return hostname + " failed " + scheme + " check!"
```

Finally, we can check a given hostname:

```python
def check_host(hostname):
    print(expect_check(301, hostname, "http"))
    print(expect_check(200, hostname, "https"))
```

This lets us make calls like

```python
check_host("traviscj.com")
```

which would print out something like

```
traviscj.com passed http check!
traviscj.com passed https check!
```

With that in place, we need some definition of which hosts we want to check. Obviously, we could just define it in the python source, but it'd be nice to make it a bit more reusable. So let's define a simple JSON file that includes the hostnames that I want to ensure work correctly. Here's a sample of a few of them:

```json
[
    { "host": "bovineherdhealth.com" },
    { "host": "priceofth.at" },
    { "host": "traviscj.com" }
]
```

and a tiny bit of file-loading logic:

```python
import json

def from_file(hosting_check_filename):
    with open(hosting_check_filename) as hosting_check_file:
        hosting_check = json.load(hosting_check_file)
    for entry in hosting_check:
        check_host(entry['host'])
```

and finally, a tiny final bit of wiring:

```python
def main():
    from_file('/Users/traviscj/hosting-check/hosting_check.json')

if __name__ == "__main__":
    main()
```

And now we have a way to test all the host forwarders.

# inspired by magicmirror

I've been really inspired by the MagicMirror project. The basic idea is getting a piece of mirror glass and putting a monitor and computer behind it, then having a status page show some pertinent information about the day (like weather, calendar, news, etc). So it looks like a regular mirror, but when you look closely, it shows the extra information.
I'd like to put one in the bedroom to replace our tall mirror. That'll be pretty cool, but it requires a bit of extra thought: our magicmirror setup would replace a tall and skinny mirror, so we'll probably just want a monitor behind the top part of the mirror. That will require some extra bracing to hold the monitor up in the frame. I'll need to think about that a while.

To hold me over in the meantime, and to get a platform to play with the software, I ordered a Raspberry Pi 2 and a small screen. With that, I'll be able to play with it on a small scale first. I was thinking about mounting it in between two cabinets in the bathroom, but just realized that might not work well with the steam from showers and so forth.

One thing I'd really like to see out of it would be bank & card balances. I guess I'm a little obsessive about that stuff, but it really would be nice to have. This desire sent me down a big rabbit hole of trying to get OFX transactions downloaded. Of course, it seems nearly impossible. I was about to give up, but then stumbled across the fantastic mintapi python package.

# individual sealed ibuprofen & zyrtec

Today I made some individually sealed ibuprofen (Advil) and cetirizine (Zyrtec) packets:

I did it by cutting about a two-inch strip of vacuum-sealing bag, then doing several perpendicular seals far enough apart that I could pop a couple of pills in between them.

Why bother? Mostly, I want to keep a few doses in my backpack / car, but don't want them turning into dust. (The softer coating on the Kirkland pills seems especially problematic.) The next best idea is of course just buying the commercial version, but I don't really need 50 packets at a time; I'd prefer to keep more stock in the "more liquid" bottle form. Finally, waterproofing can't hurt!

I meant to do double seals, so it'd be easy to separate the cells. I remembered on the first seal, but then forgot to do this on all the others, so I had to very carefully slice down the middle of each seal.
This probably wouldn't be sufficient for food storage, but it seems like it'll be fine for storing dry pills. I cut small notches on each packet so I might have a chance at tearing them open without tools. I cut the bag almost exactly the same width, but the ibuprofen bags wouldn't really vacuum because I put two ibuprofen pills per slot, so I had to settle for just sealing them. It took about 10-12 minutes to do a row of 10 pills the second time I did it; I could probably get faster if I did it a bit more often. Cost calculations clearly reflect consumables only, and in particular don't reflect the cost of the vacuum sealer and other tools.

# first canning adventure: brandied pears

I've been wanting to get into canning for a while, and it was finally time to take the plunge: I made brandied pears. One of the really striking things is how simple the ingredients list is:

• pears
• lemon juice
• sugar
• water
• brandy

Here's the setup just before I started:

Allison got me a pressure canner/cooker for Christmas. It's huge! After I cooked the pears, I packed them, mixed the brandy into the syrup, and eventually got them into the cans, sealed and processed:

They are delicious! I think I'm going to try something next that requires the pressure canner's features, so I've got a few things on my list.

# pragmatic edc

Lots of folks wax poetic about their every day carry kit, and then include things like fishing hooks. That might make a "pragmatic hiking edc" kit, but I'm interested in some of the more realistic things that might happen:

1. Phone is dead and I need to call someone (wife, parent, car insurance company, whatever) but don't have the relevant phone number / insurance number / etc.
2. Minor illness like headache, indigestion, things like that.
3. Forgot my wallet at home and need lunch or cab fare.
4.

# recent projects - sous vide, hydroponic pump, and another table

I've been working on a few fun projects recently.
First: I built a sous vide temperature controller out of a crock pot and a cheap import temperature controller. It looks like this:

I've used it to cook eggs and steak. Both turned out great! Here are the only shots I got of the steak:

Next, I've put together another refinement of the hydroponic tomato setup we have in our living room. (I just realized I haven't added the backposts about it yet.) Allison and I started three tomato plants back in May:

They've grown into this:

That is a thirsty plant, so I rigged up a small electric pump to refill the bucket from a larger water container below:

Finally, I've been wanting some more desk space at home, but haven't wanted to get a full-size desk replacement. I wanted to make something with the same construction as the coffee table project, but was wary of starting another hand-sawing project. So I got one of the Makita circular saws! It is impossible to overstate how much easier it made the project. The original coffee table took several days of work and hand sawing, while this time I had most of the cutting done in about a half hour! Then I assembled the sides, put the top on, and put it in place:

I haven't stained it yet, but wanted to get a feel for fit and usefulness before that. One of the first things I realized was that it was really dark on the shelf. A long while back, Mom had given me some snazzy LED strip lights, so I gave them a try:

# build_json.sh

This might seem silly, but I've been playing with some json.sh scripts that build legitimate JSON bodies and are easily filled into a shell script variable as needed. The basic driving idea is that there are lots of slick ways to pull data out of JSON (either by programming something with python's json module, or running a command-line tool like jq), but not as many friendly ways to build some JSON out of a given token. Often, you have a list of identifiers and you need to build a bunch of JSON blobs from that list.
For example, say we have a file called things that contains

```
thing1
thing2
thing3
thing4
```

and we need to generate

```
{"value":"$THINGTOKEN"}
```


for each token in things. Then we can simply run a tiny shell command like

```shell
Traviss-MacBook-Pro% while read token; do
> echo $(newBuilder | set value $(quoted $token) | build)
> done < things
{"value":"thing1"}
{"value":"thing2"}
{"value":"thing3"}
{"value":"thing4"}
```


Easy as that! There's no waiting for heavy VMs to start up or anything like that, just run it.
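For comparison, the same transformation is also only a few lines of Python. This is just a sketch of the equivalent logic (with the token list inlined in place of the `things` file), and unlike the shell version it does pay the interpreter startup cost:

```python
import json

# Hypothetical input, standing in for the `things` file above.
tokens = ["thing1", "thing2", "thing3", "thing4"]

# json.dumps handles the quoting/escaping that the `quoted` helper handles in shell.
blobs = [json.dumps({"value": token}) for token in tokens]
for blob in blobs:
    print(blob)
```

Note that `json.dumps` emits a space after the colon by default (`{"value": "thing1"}`), so the output is equivalent JSON but not byte-identical to the shell builder's.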