I've Been Watching You
Growth and Monetization
Published: 10/23/2020
Subscribe to Ja3k
That's right, this blog is now a newsletter.
Growth
A change to my blog that didn't quite fit into my last post is that I started logging access to my website sometime in February. I thought now would be a good time to look at the logs and see what I can learn about my audience.
The first thing that jumps out is that there are 28602 total logs! But that doesn't actually correspond to page visits: every request is logged. Because a visit to any blog post always requests the css, and some pages additionally request many images, this number overstates my traffic. The next thing I notice is that a lot of the logs are requests for my robots.txt file. Those account for 3583 logs. Over 10%! I should probably make a robots.txt file to please all those robots. It's my belief that I don't need one if I'm fine with everyone crawling everything, but maybe the more timid bots are waiting for permission and I'm missing out on some SEO value.
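For reference, the maximally permissive robots.txt is only two lines; an empty Disallow tells crawlers that everything is fair game:

```
User-agent: *
Disallow:
```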
The question "How many people are reading my blog?" is somewhat harder to answer. I investigate the related question of how often each blog post is GOTTEN. I made a short python script [1] to search the logs, see how many times each blog was fetched, and break down the GETs by day during the week of release and by week after that.
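For the curious, here's roughly the shape of that script. This is a fresh sketch rather than the exact script I used, and it assumes Apache/S3-style access logs (bracketed timestamp, quoted request line) and post URLs that look like /blog/<slug>.html:

```python
import re
from collections import Counter, defaultdict
from datetime import datetime

# Matches a [day/month/year:...] timestamp and a quoted "GET /path HTTP/..." request.
LOG_LINE = re.compile(r'\[(\d{2}/\w{3}/\d{4}).*?\] "GET (\S+) HTTP')

def count_gets(log_path, blog_prefix="/blog/"):
    # gets[post][date] = number of times that post was fetched that day.
    # Grouping days into release week vs. subsequent weeks happens on top of this.
    gets = defaultdict(Counter)
    with open(log_path) as f:
        for line in f:
            m = LOG_LINE.search(line)
            if not m:
                continue  # not a GET, or a line in a format I don't expect
            day, path = m.groups()
            if not path.startswith(blog_prefix) or not path.endswith(".html"):
                continue  # skip css, images, robots.txt, and other non-post requests
            date = datetime.strptime(day, "%d/%b/%Y").date()
            gets[path][date] += 1
    return gets
```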
Plotly makes it easy to make some quick graphs and then export them as pngs or just as raw html [2]. You can see my graphs of blog visits during their first week, and week by week afterwards, below. Note that the colors are vaguely correlated with categories; for instance, red lines are mtg posts. Also note that you can hide a line by clicking it or isolate it by double clicking it. That's fairly useful because the data is dominated by my 2xm review.
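The Plotly side is only a few lines. A minimal sketch, assuming the per-day counts from the script above:

```python
import plotly.graph_objects as go

def plot_gets(gets, out_html="blog_visits.html"):
    # One line per post; `gets` is the {post: {date: count}} mapping from above.
    fig = go.Figure()
    for post, by_date in gets.items():
        dates = sorted(by_date)
        fig.add_trace(go.Scatter(
            x=dates,
            y=[by_date[d] for d in dates],
            mode="lines",
            name=post,
        ))
    fig.update_layout(title="Blog visits by day")
    fig.write_html(out_html)              # standalone html you can embed in a page
    # fig.write_image("blog_visits.png")  # png export needs the kaleido package
    return fig
```

Hiding a trace by clicking its legend entry and isolating it by double clicking come free with Plotly's default legend behavior.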
Some thoughts and observations on the data:
- Unsurprisingly, almost all blogs are most read on the day of release and then readership gradually tapers off. For the exceptions, it's generally the case that I didn't tweet about them on the day of publication. The podcasts actually spike the next day. Maybe people were saving them for their daily commutes?
- My most read post by far is my review of 2xm. That's probably thanks to it getting more retweets on twitter than my normal posts. I also chose a pretty clickbaity title for that tweet. I'm not sure why my ZNR post did so much worse. Maybe my friends just weren't interested in spotting me any more clout. I sort of thought it was much better than my 2xm post thanks to the embedded party simulator. Maybe 2xm is just too much cooler than ZNR. Or maybe it's because the autocarding is broken on that page. I need to figure out what's going on there. Is autocard dead?
- Speaking of my ZNR review I made a simulator only page which I posted to r/lrcast and r/spikes getting about 136 total upvotes. The page was visited 163 times. This surprised me because I would have expected far more people to click than upvote. I guess it's easier to upvote than click.
- My second most read post on day of release was Rethinking my Webpage Generation. This sort of surprised me since I thought it was a very niche post. Maybe I should post more about tinkering with my site. I certainly do a lot of tinkering. I haven't analyzed last week's performance yet so I don't know if my subsequent design doc did as well.
- Blog posts get more reads in their first week than in subsequent weeks, but after the first week there isn't much of a downward trend.
- Since February I've had my blogs requested 5905 times. I'm not sure how that relates to real users, since of course some of those requests are bots. I'm also not sure how many unique visitors that corresponds to. I think I should be able to count unique devices by looking at the logs, but I haven't yet (a rough sketch of what that might look like is just below this list). It's sort of strange: the number feels extremely small in the context of the internet. But in another sense it's huge. I definitely didn't talk to that many people face to face in 2020. Measured in number of interactions, this blog constitutes the majority of my socialization in 2020. But it certainly doesn't feel like that. I don't think my mind is well equipped to deal with the internet.
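Here's roughly what that unique-device count could look like: a sketch that dedupes on (ip, user agent) pairs, assuming Apache-style combined log lines where the ip is the first field and the user agent is the last quoted field. My actual log format may differ, and NAT and shared browsers make this an approximation at best.

```python
def unique_visitors(log_path):
    # Crude stand-in for "unique devices": dedupe on (client ip, user agent).
    seen = set()
    with open(log_path) as f:
        for line in f:
            parts = line.rstrip("\n").split('"')
            if len(parts) < 3 or not parts[0].split():
                continue  # not a line in the expected format
            ip = parts[0].split()[0]
            user_agent = parts[-2]  # text inside the last pair of quotes
            seen.add((ip, user_agent))
    return len(seen)
```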
Monetization
A very unpopular blog post I wrote was my Amazon Affiliate Link Spam Post. And indeed only one person actually clicked a link and bought something. Shortly afterwards Amazon canceled my account, since you need to drive three sales in 180 days for them to confirm it. I don't know why the idea of making money doing silly things is so appealing to me. The same motivations that lead me to charge limes, play microstakes poker, gamble on predictit, and play mtg led me to try out Amazon Affiliate Marketing Links. I wish I got as excited about actual work.
I guess there are a lot of other ways to monetize my sweet content. I could try out a patreon or a kickstarter or google ads or a buy me a coffee link. I don't think I'll ever make an amount of money I care about, or even cover the 68 cents a month I'm paying to Amazon to host the site, but I guess I'd learn some things and have some new experiences. Which I guess is the point of the site. It'd add more life to this fantasy that I can make a career independently doing whatever I want.
About a year ago I watched Julie & Julia, which tells the story of Julia Child, who wrote a famous cookbook, and Julie Powell, who wrote a cooking blog in 2002 and became famous. It feels like such a different world, when you could write some random stuff on the internet and get famous as if everyone wasn't simultaneously hacking away at their keyboards. It's so strange that a hundred years ago, writing something and distributing it to this many people would have been the work of a small team, and a few hundred years before that it would have been simply impossible. And now I can do it alone in my spare time and it's thoroughly uninteresting.
I don't know exactly why I've been writing this blog. But when I see birds collecting trash for their nests I think I'm close.
[1] Please no comments about how I did it in O(logs*blogs) time when I could have done it in O(logs+blogs) time. When I have a large enough corpus to make it matter I'll optimize the script.
[2] Actually there are two rather annoying things about exporting the html. The first is that when I embedded it into the page, the export covered the graph with these option menus, which made it impossible to play with the data. I fixed that by finding the menu's id and hiding it. The second is that on a small screen, like a phone, the legend simply covers the graph. I fixed that by hiding the legend in my mobile css. Probably I should figure out how to put the legend below the graph instead. I suppose it's my own fault for putting 20 lines on one graph.
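For what it's worth, both problems can probably be handled on the Plotly side instead of in css. A sketch (these are standard Plotly layout and config options, not something I've actually wired into my site yet):

```python
import plotly.graph_objects as go

fig = go.Figure()  # ...add the 20 traces as before...

# Put the legend in a horizontal strip below the plot instead of on top of it,
# and drop the floating option menu (the "modebar") from the exported html.
fig.update_layout(legend=dict(orientation="h", yanchor="top", y=-0.2))
fig.write_html("blog_visits.html", config={"displayModeBar": False})
```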