Whether it's the fireplace, the GeForce, or sunny surf sessions, Xmas vacation has been about staying warm.
Furnace adventure
Last year I did a rewire of the furnace and expected everything to be good to go upon relighting the pilot this year. No such luck; the heat wouldn't turn on. I shorted the thermostat wires to confirm the 24V equipment was functional, and reconnecting the thermostat improved things a bit - the heat would turn on, but it wouldn't turn off unless I used the physical switch.
There were two things to chase down: the 120V → 24V transformer that runs the gas valve, and the thermostat itself.
The 50-year-old transformer was putting out something like 26V, and I had no idea if that was within spec.
A new transformer was cheap; a new thermostat was a bit pricier. I replaced both, thinking that maybe a hot transformer output had blown up the thermostat electronics (which would explain the hard shutoff working but not the electronic switch). All is well now.
Meme strats
The bomb car
The PUBG bot league (80% bots) has opened the door to some slightly looser play. We've gone from meme drops (Hacienda Heist, Ferry Pier, Cave, Train) to meme strategies - mostly involving C4.
Being the only wall-penetrating, large-AOE weapon in PUBG, C4 is great for clearing out players that are holed up in buildings. But you can also stick it to a vehicle and thereby surprise your opponents... just as long as you're not particularly attached to your own fate.
The holy grail is to get a chicken dinner using this tactic. Shane almost pulled it off using the Gold Mirado, but alas he was thrown off track by the moguls.
The bridge ambush
In the early days of PUBG you could set up a roadblock on a mil base bridge and catch teams trying to cross. It was rarely worthwhile since your backside would be easy pickings for teams already on the in-circle side. Still, the addition of spike strips has made this play worthwhile simply for the lulz.
We managed to set up an ambush that was so intimidating it had our victims jumping over the side. Funny, but not as good as the sparks from blown tires.
Vids
I've moved on from chronicling lengthy chicken dinner runs to compiling all the best of the shenanigans - good and bad. It's way too much effort for a single chuckle from just the people who were there, but pandemic.
Autoencoderish
I picked back up on autoencoder-related Keras stuff. I had previously thought that various image-processing applications could benefit from the network getting a larger patch of the input image than the patch it was expected to generate.
My early attempts would convolve and max pool, then transpose-convolve back up to the output size. Sometimes they would pass through a flattening layer - I've seen it both ways in the examples. Frustratingly, the output guesses always seemed to resemble the full input image. It's like the network wouldn't learn to ignore the fringe input that I was providing to be helpful.
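Roughly what those early attempts looked like, as a sketch - the 128-pixel framed input and 64-pixel output are illustrative sizes, not necessarily what I actually used:

```python
# Early-attempt shape (illustrative): convolve and max pool the framed input
# down, then transpose-convolve back up to the (smaller) output size.
import tensorflow as tf
from tensorflow.keras import layers

inp = layers.Input(shape=(128, 128, 3))          # framed input patch
x = layers.Conv2D(32, 3, activation="relu", padding="same")(inp)
x = layers.MaxPooling2D(2)(x)                    # 64x64
x = layers.Conv2D(64, 3, activation="relu", padding="same")(x)
x = layers.MaxPooling2D(2)(x)                    # 32x32
x = layers.Conv2DTranspose(32, 3, strides=2, activation="relu", padding="same")(x)  # back up to 64x64
out = layers.Conv2D(3, 3, activation="sigmoid", padding="same")(x)                  # 64x64x3 guess

early_model = tf.keras.Model(inp, out)
```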
In retrospect, the solution was pretty obvious: do some convolutions on the whole thing, but eventually just crop the feature maps down to the output size. My hope was that the cropped part would benefit from the hints provided by the framing in the convolution layers before the cropping layer.
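A minimal sketch of the cropped version, under the same assumed sizes (128-pixel framed input, 64-pixel output):

```python
# Crop-instead-of-upsample sketch: convolve over the whole frame so the border
# can inform the features, then crop the feature maps down to the output region.
import tensorflow as tf
from tensorflow.keras import layers

inp = layers.Input(shape=(128, 128, 3))
x = layers.Conv2D(32, 3, activation="relu", padding="same")(inp)
x = layers.Conv2D(64, 3, activation="relu", padding="same")(x)
x = layers.Cropping2D(cropping=32)(x)            # 128 -> 64: keep the center, drop the framing
x = layers.Conv2D(64, 3, activation="relu", padding="same")(x)
out = layers.Conv2D(3, 3, activation="sigmoid", padding="same")(x)

crop_model = tf.keras.Model(inp, out)
crop_model.compile(optimizer="adam", loss="mse")
```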
Autoencoders were used for noise reduction and image repair before GANs and other, better models were developed. I decided to train my not-really-an-autoencoder to try to fill in a 64-pixel patch of noise in the semi-consistent input set of MTG cards.
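Training pairs for that can be generated along these lines (again a sketch; the centered noise patch and the 128-pixel card crop are assumptions):

```python
# Hypothetical training-pair generation: blot out the center 64x64 of a
# 128x128 card crop with noise; the clean center is the reconstruction target.
import numpy as np

def make_pair(img128, patch=64, rng=None):
    """img128: 128x128x3 float array in [0, 1] -> (noisy framed input, clean 64x64 target)."""
    if rng is None:
        rng = np.random.default_rng()
    off = (img128.shape[0] - patch) // 2
    clean_center = img128[off:off + patch, off:off + patch].copy()
    noisy = img128.copy()
    noisy[off:off + patch, off:off + patch] = rng.random((patch, patch, 3))
    return noisy, clean_center
```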
Iterating on the model a few times, the best I could come up with was a network that made a pretty good guess at the edges of the noisy patch - the part that seemed to benefit from knowing the framing input.
I sold a couple of calls on the Ericsson shares I already owned. They were exercised, and I made some modest gains between the premium and the sale price. An MJ CSP expired, but I wouldn't mind having the shares. After jumping to $30 on meme energy, PLTR has fluctuated in the mid-20s. It wheeled from exercised put to exercised call in about a week. We will eventually get volatility, and it will likely be when I'm no longer in it, but I've been keeping a position in VXX and/or UVXY.
High-yield bonds have again shown they are strangely not bad short-term investments - at least right now. I don't think I got into bonds until March-ish. AbbVie was rumored to have a covid treatment way back, so I was sitting on shares; it made sense to sell a call above my purchase price. Premiums on AMD have been pretty high for a $90-ish stock, so I sold a CSP there.
Back in November it wasn't clear who would be first to market with a covid vaccine, so I bought a few calls on the leading candidates. They didn't go much of anywhere because the anticipation was pretty well priced in. Sitting on the underperforming RKT and volatility indexes, I've managed to sell a few calls that would still keep me in the black if they exercised.
UPS has done a good job, but I think I'm over owning shares in volumes that don't end in -00. The first week of December I pocketed the premiums of weekly MSFT and TSM cash-secured puts, as well as the sale of some PFE that I had from the week before.
I expected a market pullback by November or December and had some longer-dated puts on SPY and JETS. But the market keeps on keeping on. The bearishness was balanced by my faith in CLX, PLTR, RKT, and XLE.
Nordstrom bonds rallied from $89 to $99, so it seemed sensible to cash out of a position with a mere 4% return. I happily broke even on longstanding small investments in Activision and Nvidia and saw some MJ calls exercise a few bucks higher than what I paid.
Tweaking the TensorFlow implementation of Neural Style Transfer.
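For background, the Gram-matrix style loss at the heart of most TensorFlow style-transfer implementations looks something like this - the standard formulation, not the particular tweak:

```python
import tensorflow as tf

def gram_matrix(features):
    """features: (batch, height, width, channels) activations from one style layer."""
    result = tf.linalg.einsum("bijc,bijd->bcd", features, features)
    num_locations = tf.cast(tf.shape(features)[1] * tf.shape(features)[2], tf.float32)
    return result / num_locations

def style_loss(style_features, generated_features):
    """Mean squared difference between the Gram matrices of the style and generated activations."""
    return tf.reduce_mean(tf.square(gram_matrix(style_features) - gram_matrix(generated_features)))
```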
Related / external
Risky click advisory: these links are produced algorithmically from a crawl of the subsurface web (and some select mainstream web). I haven't personally looked at them or checked them for quality, decency, or sanity. None of these links are promoted, sponsored, or affiliated with this site. For more information, see this post.
Neural networks are a powerful tool in machine learning that can be trained to perform a wide range of tasks, from image classification to natural language processing. In this blog post, we'll explore how to teach a neural network to add together two numbers. You can also think about this article as a tutorial for TensorFlow.