Infopost | 2020.12.25

PUBG buggy C4 kill explosion

Whether it's the fireplace, the geforce, or sunny surf sessions, xmas vacation has been about staying warm.

Living room Christmas tree Holiday card
Furnace adventure

furnace switch wiring diagram fan motor power supply transformer limit

Last year I rewired the furnace and expected everything to be good to go upon relighting the pilot this year. No such luck; the heat wouldn't turn on. I shorted the thermostat wires to confirm the 24v equipment was functional, and reconnecting the thermostat improved things a bit: the heat would turn on, but it wouldn't turn off unless I used the physical switch.

Two wire thermostat diagram heater furnace transformer

There were two things to chase down: the 120v -> 24v transformer that runs the gas valve and the thermostat itself.

Old electronics transformer heater furnace weimaraner

The 50-year-old transformer was putting out something like 26v, and I had no idea if that was within spec.
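As a sanity check, assuming the common ±10% tolerance on a 24 VAC control transformer's nominal output (the nameplate is the real authority), 26v pencils out as in-spec, and unloaded readings tend to run a bit high anyway:

# Rough in-spec check; the ±10% tolerance is an assumption, not from this
# transformer's datasheet.
nominal, measured, tolerance = 24.0, 26.0, 0.10
low, high = nominal * (1 - tolerance), nominal * (1 + tolerance)
print(low, high, low <= measured <= high)   # 21.6 26.4 True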

Heater furnace transformer wires

A new transformer was cheap; a new thermostat was a bit pricier. I replaced both, thinking that maybe a hot transformer output had blown up the thermostat electronics (which would explain why the hard shutoff worked but the electronic switch didn't). All is well now.
Meme strats

The bomb car

PUBG meme strat C4 gold Mirado suicide bomb Miramar

PUBG (80%) bot league has opened the door to some slightly looser play. We've gone from meme drops (Hacienda Heist, Ferry Pier, Cave, Train) to meme strategies - mostly involving C4.


Being the only wall-penetrating, large-AOE weapon in PUBG, C4 is great for clearing out players that are holed up in buildings. But you can also stick it to a vehicle and thereby surprise your opponents... just as long as you're not particularly attached to your own fate.

PUBG gold Mirado C4 suicide bomb strategy final circle

The holy grail is to get a chicken dinner using this tactic. Shane almost pulled it off using the Gold Mirado, but alas he was thrown off track by the moguls.

The bridge ambush

PUBG bridge ambush spike strip motorcycle ghillie suit Miramar

In the early days of PUBG you could set up a roadblock on a mil base bridge and catch teams trying to cross. It was rarely worthwhile since your backside would be easy pickings for teams already on the in-circle side. Still, the addition of spike strips makes the play worth it simply for the lulz.


We managed to set up an ambush that was so intimidating it had our victims jumping over the side. Funny, but not as good as the sparks from blown tires.

Vids


I've moved on from chronicling lengthy chicken dinner runs to compiling the best of the shenanigans - good and bad. It's way too much effort for a single chuckle from just the people that were there, but pandemic.

PUBG shenanigans jumps package Gold Mirado buggy vehicle kill Miramar
Autoencoderish


I picked back up on autoencoder-related Keras stuff. I had previously thought that various image processing applications could benefit from seeing a larger patch of the input image than the network is expected to generate.

My early attempts would convolve and max pool, then transpose convolve back up to the output size. Some passed through a flattening layer in the middle; I've seen it both ways in the examples. Frustratingly, the output guesses always seemed to resemble the full input image, as if the network wouldn't learn to ignore the fringe input I was providing to be helpful.

In retrospect, the solution was pretty obvious: do some convolutions on the whole thing, but eventually just crop the feature maps down to the output size. My hope was that the cropped region would benefit from the hints provided by the surrounding frame in the convolution layers before the cropping layer. The layer summary (with a code sketch after it) looks like this:

_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d (Conv2D)              (None, 192, 192, 32)      896
_________________________________________________________________
batch_normalization (BatchNo (None, 192, 192, 32)      128
_________________________________________________________________
gaussian_noise (GaussianNois (None, 192, 192, 32)      0
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 192, 192, 32)      9248
_________________________________________________________________
dropout (Dropout)            (None, 192, 192, 32)      0
_________________________________________________________________
cropping2d (Cropping2D)      (None, 64, 64, 32)        0
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 64, 64, 32)        4128
_________________________________________________________________
batch_normalization_1 (Batch (None, 64, 64, 32)        128
_________________________________________________________________
dropout_1 (Dropout)          (None, 64, 64, 32)        0
_________________________________________________________________
conv2d_3 (Conv2D)            (None, 64, 64, 64)        8256
_________________________________________________________________
conv2d_4 (Conv2D)            (None, 64, 64, 64)        36928
_________________________________________________________________
dense (Dense)                (None, 64, 64, 3)         195
=================================================================
Total params: 59,907
Trainable params: 59,779
Non-trainable params: 128
_________________________________________________________________
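Here's a minimal Keras sketch that reproduces a summary like the one above. The kernel sizes fall out of the parameter counts, but the activations and noise level are my guesses:

import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(192, 192, 3)),
    layers.Conv2D(32, 3, padding='same', activation='relu'),   # 896 params
    layers.BatchNormalization(),
    layers.GaussianNoise(0.1),
    layers.Conv2D(32, 3, padding='same', activation='relu'),   # 9,248 params
    layers.Dropout(0.25),
    layers.Cropping2D(64),           # 192x192 -> 64x64: keep only the center
    layers.Conv2D(32, 2, padding='same', activation='relu'),   # 4,128 params
    layers.BatchNormalization(),
    layers.Dropout(0.25),
    layers.Conv2D(64, 2, padding='same', activation='relu'),   # 8,256 params
    layers.Conv2D(64, 3, padding='same', activation='relu'),   # 36,928 params
    layers.Dense(3, activation='sigmoid'),                     # 195 params
])
model.summary()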

Autoencoders were used for noise reduction and image repair before GANs and other, better models were developed. I decided to train my not-really-an-autoencoder to fill in a 64-pixel patch of noise in a semi-consistent input set: MTG cards.
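Training roughly amounted to pasting a noise patch into each card and regressing the clean patch. A sketch, where the array names, patch placement, and training hyperparameters are assumptions (model is the sketch above):

import numpy as np

# Stand-in data; the real set was semi-consistent crops of MTG card scans.
images = np.random.rand(256, 192, 192, 3).astype('float32')

def add_noise_patch(batch):
    # Blank out the center 64x64 region with uniform noise.
    noisy = batch.copy()
    noisy[:, 64:128, 64:128, :] = np.random.rand(len(batch), 64, 64, 3)
    return noisy

targets = images[:, 64:128, 64:128, :]   # the clean center patch is the label
model.compile(optimizer='adam', loss='mse')
model.fit(add_noise_patch(images), targets, epochs=50, batch_size=32,
          validation_split=0.1)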


Iterating on the model a few times, the best I could come up with was a network that made a pretty good guess at the edges of the noisy patch - exactly where knowing the frame input should help.

_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
...
_________________________________________________________________
conv2d_4 (Conv2D)            (None, 64, 64, 64)        36928
_________________________________________________________________
conv2d_5 (Conv2D)            (None, 64, 64, 3)         195
=================================================================
Total params: 59,907
Trainable params: 59,779
Non-trainable params: 128
_________________________________________________________________

I also satisfied my curiosity about output layers: a Dense layer vs a 1x1 convolution. The results seemed indistinguishable.
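That tracks: applied to a 4D tensor, Dense operates per-pixel on the channel axis, so it's parameter-identical to a 1x1 convolution. A quick check, using the shapes from the model above:

import tensorflow as tf
from tensorflow.keras import layers

# Both heads map 64 channels -> 3 at each of the 64x64 positions.
inp = tf.keras.Input(shape=(64, 64, 64))
dense_head = tf.keras.Model(inp, layers.Dense(3)(inp))
conv_head = tf.keras.Model(inp, layers.Conv2D(3, 1)(inp))
print(dense_head.count_params(), conv_head.count_params())   # 195 195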


I don't think there's any interesting place to take this, but it was neat to see the framing idea validated.
Trading

Stock company valuations rug pull crash tweet Sven Henrich

Another month in wheels and yolos.


I sold a couple calls on the Ericsson shares I already owned. They were exercised, and I made some modest gains on the premium and sale price. An MJ CSP expired, though I wouldn't mind having the shares. After jumping to $30 on meme energy, PLTR has fluctuated in the mid 20s; it wheeled from exercised put to exercised call in about a week. We will eventually get volatility, and it will likely be when I'm not still in it, but I've been keeping a position in VXX and/or UVXY.
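For the uninitiated, a wheel round trip pencils out something like this (the numbers are made up for illustration, not actual fills):

# Hypothetical wheel: sell a cash-secured put, get assigned, then sell a
# covered call against the shares until they get called away.
put_strike, put_premium = 25.0, 1.10
call_strike, call_premium = 26.0, 0.90
cost_basis = put_strike - put_premium                 # 23.90 effective entry
profit = (call_strike - cost_basis) + call_premium    # 3.00 per share
print(cost_basis, profit)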


High-yield bonds have again shown they're strangely not-bad short-term investments - at least right now. I don't think I got into bonds until March-ish. AbbVie was rumored to have a covid treatment way back, so I was sitting on shares; it made sense to sell a call above my purchase price. Premiums on AMD have been pretty high for a $90-ish stock, so I sold a CSP there.

Back in November it wasn't clear who would be first to market with a covid vaccine, so I bought a few calls on the leading candidates. They didn't go much of anywhere because the anticipation was pretty well priced in. Sitting on the underperforming RKT and volatility indexes, I've managed to sell a few calls that would still keep me in the black if they exercised.


UPS has done a good job, but I think I'm over owning shares in volumes that don't end in -00. The first week of December I pocketed the premiums of weekly MSFT and TSM cash-secured puts, as well as the proceeds from selling some PFE I had from the week before.


I expected a market pullback by November or December and had some longer-dated puts on SPY and JETS. But the market keeps on keeping on. The bearishness was balanced by my faith in CLX, PLTR, RKT, and XLE.


Nordstrom bonds rallied from $89 to $99 so it seemed sensible to cash out of a position with a mere 4% return. I happily broke even on longstanding small investments in Activision and Nvidia and saw some MJ calls exercise a few bucks higher than what I paid.



Related - internal

Some posts from this site with similar content.

Post
2021.01.02

The next break

Reflecting on investments and Warren B. Some sunset surf shots, video games, and AI image stylization.
Post
2022.08.19

Rewiring

After some interesting reads, I implemented a convolution+pooling block inspired by ResNet. It looks like this:
Post
2020.11.29

Mods

Since it was just the two-ish of us, Jes and I went to the Lodge for Thanksgiving lunch.

Related - external

Risky click advisory: these links are produced algorithmically from a crawl of the subsurface web (and some select mainstream web). I haven't personally looked at them or checked them for quality, decency, or sanity. None of these links are promoted, sponsored, or affiliated with this site. For more information, see this post.

polukhin.tech

Lightweight Neural Network Architectures | Andrii Polukhin

As the field of Deep Learning continues to grow, the demand for efficient and lightweight neural networks becomes increasingly important. In this blog post, we will explore six lightweight neural network architectures.
404ed
blog.risingstack.com

AI Development Tools Compared - The Differences You'll Need to Know - RisingStack Engineering

Artificial intelligence is a complex field. See how different AI development tools compare and find the best one for you.
404ed
financialsamurai.com

How To Pay No Capital Gains Tax After Selling Your House

Learn how to pay little to no capital gains tax after selling your primary home for big profits. Use the $250K / $500K profit exclusion rule.
