I bought an antique telegraph sounder a while back, and I’ve been working on a project that clicks out emails from my Etsy store when I get an order. I’ve gone through several generations and come up with something I really like. What follows is a description of my process for going from concept to finished piece. The code and PCB are open-source, and can be found on my GitHub.
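The heart of a project like this is translating text into dots and dashes the sounder can click out. As a minimal sketch (illustrative only; the project’s actual code is on GitHub), the character-to-Morse mapping might look like:

```python
# Sketch: encode text as Morse code, the way a sounder project might
# translate an incoming order email into clicks.
MORSE = {
    'A': '.-', 'B': '-...', 'C': '-.-.', 'D': '-..', 'E': '.',
    'F': '..-.', 'G': '--.', 'H': '....', 'I': '..', 'J': '.---',
    'K': '-.-', 'L': '.-..', 'M': '--', 'N': '-.', 'O': '---',
    'P': '.--.', 'Q': '--.-', 'R': '.-.', 'S': '...', 'T': '-',
    'U': '..-', 'V': '...-', 'W': '.--', 'X': '-..-', 'Y': '-.--',
    'Z': '--..', '0': '-----', '1': '.----', '2': '..---',
    '3': '...--', '4': '....-', '5': '.....', '6': '-....',
    '7': '--...', '8': '---..', '9': '----.',
}

def encode(text):
    """Return the Morse encoding of text, with words separated by '/'."""
    words = text.upper().split()
    return ' / '.join(
        ' '.join(MORSE[c] for c in word if c in MORSE)
        for word in words
    )

print(encode('new order'))  # -. . .-- / --- .-. -.. . .-.
```

Driving the physical sounder is then a matter of timing: a dah is conventionally three times as long as a dit.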
When setting up a new Raspberry Pi, it’s helpful to have console access, which could mean hooking up a screen and keyboard to the Pi. Another option is to connect your Pi to your computer with a USB cable, and connect to the serial console (similar to connecting via SSH or telnet). You need a special cable, but it saves you from having to keep a monitor and keyboard around just for the Pi.
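With a USB-to-TTL serial cable attached to the Pi’s UART pins, opening the console from the computer side can be as simple as the following (the device name is an assumption; it varies by OS and adapter):

```shell
# Find the adapter after plugging it in -- on Linux it usually shows
# up as /dev/ttyUSB0; on macOS, something like /dev/tty.usbserial-XXXX.
ls /dev/tty*

# Open the serial console at the Pi's default 115200 baud.
# (Detach/quit screen with Ctrl-A then k.)
screen /dev/ttyUSB0 115200
```

`minicom` or `picocom` work just as well if you prefer them to `screen`.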
(cross-posted to blog.tempo-db.com)
At TempoDB, we maintain multiple environments (production, staging, etc), and each environment is in a datacenter (Dallas, Seattle, etc). For the most part, we want strict separation between environments, but we have a growing list of traffic that ought to be allowed to flow between them (see below). We designed a new architectural primitive which allows us to securely permit some traffic, while still blocking everything else.
I picked up a new Mac Mini this Friday to play around with Xen at home. Right now I run a few services off of one server in my apartment, but I’d prefer to have separate VMs for each service, because I find that more manageable.
I put up a handful of small servers with SSH honeypots running, and have been watching who tries to break in. I didn’t publicize the addresses, or point any DNS at them, but they almost immediately got found by hackers across the globe. Here’s a visualization and analysis of the data so far.
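The raw material for an analysis like this is the SSH daemon’s log of failed logins. As a hedged sketch (the log lines below are made up, and this is not necessarily the pipeline behind the visualization), tallying attempts per source IP might look like:

```python
# Sketch: tally failed SSH login attempts per source IP from
# auth.log-style lines. Sample lines are fabricated for illustration.
import re
from collections import Counter

LINE = re.compile(r'Failed password for (?:invalid user )?(\S+) from (\S+)')

sample = [
    "Oct  3 11:02:01 host sshd[811]: Failed password for root from 203.0.113.7 port 4242 ssh2",
    "Oct  3 11:02:05 host sshd[811]: Failed password for invalid user admin from 203.0.113.7 port 4243 ssh2",
    "Oct  3 11:03:17 host sshd[902]: Failed password for root from 198.51.100.23 port 5151 ssh2",
]

attempts = Counter()
for line in sample:
    m = LINE.search(line)
    if m:
        user, ip = m.groups()
        attempts[ip] += 1

print(attempts.most_common())  # [('203.0.113.7', 2), ('198.51.100.23', 1)]
```

From counts per IP, a GeoIP lookup is all it takes to put the attackers on a map.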
(Cross-posted to blog.tempo-db.com)
In addition to our REST API and language-specific client libraries, we now offer the ability to bulk import data by uploading CSVs. The intent of this feature is to support the initial load of large amounts of historical data (many millions or billions of data points). By sending us CSVs (instead of just using our API normally), customers save themselves from having to build and monitor a large one-time job, and the problem is simplified to CSV generation.
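On the customer side, the whole job reduces to writing data points out as CSV rows. A minimal sketch, assuming a simple timestamp,value layout (the exact column format TempoDB expects is not specified here, so check the import docs):

```python
# Sketch: generate a CSV of historical data points for a bulk upload.
# The timestamp,value column layout is an assumption for illustration.
import csv
import io
from datetime import datetime, timedelta

def points_to_csv(start, values, step=timedelta(minutes=1)):
    """Write (ISO-8601 timestamp, value) rows to a CSV string."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    t = start
    for v in values:
        writer.writerow([t.isoformat(), v])
        t += step
    return buf.getvalue()

csv_text = points_to_csv(datetime(2013, 1, 1), [20.1, 20.4, 19.8])
print(csv_text)
```

For billions of points you would stream rows straight to a file (or chunked upload) rather than build the string in memory, but the row-generation logic is the same.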
I opened my Etsy store in January (about 9 months ago), and have learned a lot about effectively marketing my store. For the first several months I was trying everything I could think of to increase my visitor count, and I did get a ton of people coming into my store, but the effort required on my part was substantial, and 3 months in I lost interest. That’s when things got interesting.
A while back I built my own thermostat using an Arduino, Node.js, and Google Calendar. It worked really well, but when I moved to a new apartment last year I couldn’t use it (because I now have window units instead of centralized heating/AC). I finally got around to putting it back together this weekend, but I had to rip out the (now unused) thermostat code. What was a Google-Calendar-controlled thermostat is now just a thermometer. Not nearly as cool, but I’m at least glad to have the portion that makes sense back up. You can see it here.
The site I’m working on now deploys as static files. I haven’t put up a non-server-side-dynamic site since high school, so I’m exploring my options. I thought I could just throw the whole thing up on Amazon S3, but was surprised that it was slower than the current setup (nginx on Linode). I have been reading about the importance of fast load times for conversion, Google ranking, etc. (for example), so speed is a big priority for me. Here’s how I cut my site’s page load time down from around a second to around 500ms.
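Two of the usual first levers for serving static files quickly from nginx are compression and far-future cache headers. A hedged sketch of that configuration (generic, not necessarily what this site uses; the root path is hypothetical):

```nginx
server {
    listen 80;
    root /var/www/site;   # hypothetical path to the built static files

    # Compress text assets on the wire.
    gzip on;
    gzip_types text/css application/javascript application/json image/svg+xml;

    # Let browsers cache fingerprinted assets for a long time.
    location ~* \.(css|js|png|jpg|svg|woff2?)$ {
        expires 30d;
        add_header Cache-Control "public";
    }
}
```

Beyond server config, the other big wins for static sites tend to be fewer requests (concatenation/inlining) and a CDN in front.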