How It Works

So how does this database synchronization work? Here’s the scheme, which I made up myself, so it may be completely beyond the pale. We shall find out.

The first piece is a table called sync_hosts that stores the ids of other hosts, the time they were last synced, and whether or not a sync is in progress.
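
As a rough sketch, sync_hosts might look something like this (the column names and types are just my guesses at this stage, assuming MySQL, not a settled schema):

CREATE TABLE sync_hosts (
  host_id          INT NOT NULL PRIMARY KEY,       -- id of the remote host
  last_sync        TIMESTAMP NULL,                 -- when we last synced with it
  sync_in_progress TINYINT(1) NOT NULL DEFAULT 0   -- 1 while a sync is running
);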

Then each table indicated in the config file has two additional tables created to store metadata. These tables are suffixed with _map and _sync, so if the table is called “table”, then the two additional tables are called “table_map” and “table_sync”. In this initial version, it will be left to the user to ensure that table name conflicts are avoided.

The sync table

The sync table contains the primary keys of the original table, plus three additional columns: “created” (timestamp), “modified” (timestamp), and “deleted” (tinyint(1)). These columns contain, as one might suspect, the creation time, modification time, and whether or not the row has been deleted.
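
For example, if the original table were a hypothetical “person” table with an integer primary key “id”, its sync table might look roughly like this (again, just a sketch):

CREATE TABLE person_sync (
  id       INT NOT NULL PRIMARY KEY,                        -- same primary key as person
  created  TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,    -- when the row was created
  modified TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,    -- when the row was last changed
  deleted  TINYINT(1) NOT NULL DEFAULT 0                    -- 1 if the row has been deleted
);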

The map table

The map table contains two copies of the primary keys, plus a remote host id. This table allows one to look up what the primary keys of a row are on a remote host. This is necessary because each host will be creating different rows at different times, so the primary keys will only be the same by coincidence.
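
Continuing the hypothetical “person” example, the map table might be something like this (column names are mine, not final):

CREATE TABLE person_map (
  local_id  INT NOT NULL,   -- primary key of the row on this host
  remote_id INT NOT NULL,   -- primary key of the same row on the remote host
  host_id   INT NOT NULL,   -- which remote host, per sync_hosts
  PRIMARY KEY (local_id, host_id)
);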

How it all works

The basic scheme is this. We install triggers on all the tables such that whenever a row is created, modified, or deleted in the original table, a corresponding row is created or updated in the sync table.
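
In MySQL terms (which the tinyint(1) above suggests we’re using), the insert trigger for the hypothetical “person” table might be sketched like this; the update and delete triggers would be similar, updating the modified and deleted columns respectively:

CREATE TRIGGER person_after_insert
AFTER INSERT ON person
FOR EACH ROW
  INSERT INTO person_sync (id, created, modified, deleted)
  VALUES (NEW.id, NOW(), NOW(), 0);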

Then, when it is time to sync, we get all the rows from the sync table that have been created or updated since the last sync time. We take these rows over to the remote host, look them up in the map table, and make the updates (or create the rows if they aren’t found in the map table). We then do the entire operation in reverse, so that all of the updates from the remote host are brought to the local host as well. When we are doing our updates, we check the sync table of the destination host so that we don’t write over any changes that are newer than ours.
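
The “what changed since the last sync” query is then just a matter of consulting the sync table, something like this sketch (assuming the hypothetical column names from earlier). A left join is used because a deleted row still has a sync row even though the original is gone:

SELECT s.id, s.modified, s.deleted, p.*
FROM person_sync s
LEFT JOIN person p ON p.id = s.id
WHERE s.modified > (SELECT last_sync FROM sync_hosts WHERE host_id = ?);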

That’s about it! Things will surely get more complicated as we get deeper into this, and we may have to make more rules to make it work. Or it may not work at all! Who knows?

dbsync

Time for a new project! This one is an old one: I wrote it in Java a long time ago… maybe 2009, I would say. I never quite got it working, and I think part of the problem was that I had no automated testing infrastructure.

Automated testing is something that has been elusive in my career. People like to talk about it, particularly during interviews, but very few people actually do it. I’ve only worked at one company that was really serious about it. At other places I’ve tried to introduce it myself, only to be told “we don’t have time for that”. It seems to me that if you don’t have time to do it, then you certainly don’t have time not to do it, because skipping it ends up taking more time in the long run. But we seem to get by without it somehow.

This project is a perfect candidate, though. It is complex enough that bugs will become intractable without some kind of testing infrastructure, and it lends itself well to verification. The basic premise of the project is to synchronize two relational databases. To simplify this task there are a few rules:

  • The databases are expected to have the same table structure
  • Synchronization is limited to a configured list of tables
  • There are no circular foreign key references
  • Synchronization follows a “last update wins” model

I am sure other rules will be added as things get more complicated, but that’s a start. The reason it lends itself well to testing is that we can set up our test databases with a script, synchronize them, and then dump the results to a text file that can be compared to an expected result. We may need to refine this approach, but as with our list of rules, it is a good start. Follow the progress here and on github.

npm audit

I recently acquired a new computer, and set about to see if I could re-create my python/react setup by checking out my source code from github.

Everything went pretty smoothly, except that when I run npm audit fix I get this puzzling output:

PS C:\blah> npm audit fix

up to date, audited 2022 packages in 3s

142 packages are looking for funding
  run `npm fund` for details

# npm audit report

react  <0.14.0
Severity: high
Cross-Site Scripting - https://npmjs.com/advisories/1347
fix available via `npm audit fix`


1 high severity vulnerability

To address all issues, run:
  npm audit fix

npm audit fix is doing nothing, and then telling me to run npm audit fix. Something is not right here!

I tried removing package-lock.json and node_modules and running npm install, which upgraded a few things – that’s nice, I suppose – but I end up in the same situation.

The strange thing is, when I check npm list react, it shows react 16.14.0! There is no secret version of react <0.14.0 anywhere. I checked npm list -g react just for good measure, and it’s empty. What gives?

When I install react/react-dom 16.14.0 into an empty directory, the vulnerability isn’t there. So it must be one of my other dependencies – but I remove them one by one, and I get all the way down to just react and react-dom and the vulnerability is still there! What?? Okay, so what if I take the directory without the vulnerability, and add stuff to it until it looks like my project? What then, npm audit boy??

The first thing I do is rename the directory, and bam, that’s it. The vulnerability is in both places now. I have chosen an unfortunate name for my directory of react code: “react”! npm audit must think that I’m inside a react package of version 0, and so it complains about version <0.14.0.

I suppose I should think of a better name for that directory… but at least I know what’s going on now.

Payout

At the end of our last episode, we were waiting to see if Slush Pool would recognize our sad little GPU miner. Well it did, so now the question is – when do we get our money?!?

Slush Pool has what is called a “Payout Threshold”, which is the amount we need to accumulate in order to get some Bitcoin in our wallet. The default is 0.1 Bitcoin, which as of this writing is about six grand. Fortunately, our dashboard tells us exactly how long it will take to get there at the rate we’re going:

Yes, that’s right… it will take us 1.8 million years to make six grand.

Maybe we’re being a little lofty… how about ten bucks?

Let’s see, 10/6000 = x/1,800,000… that will take us 3,000 years. I see now why everyone was telling me not to do this.

Let’s say we want to get crazy and go out and buy an ASIC. What’s the cheapest thing we can buy that will pay for itself, and make us a little cash before the earth falls into the sun?

Slush Pool’s payout formula is based on what percentage of their hash rate we can cough up during the search for a block. Their hash rate seems to hover around 4.8 Eh/s. What the heck is an Eh/s? It is Exahashes per second. That’s a quintillion. Super big. It seems like a block is found about every 4 hours 20 minutes on average, and the block value is about 7 Bitcoin. So let’s see, how much money do we want to make every four hours? A dollar? Let’s go with a dollar. Start small. A dollar is 0.000016 Bitcoin. So we need to haul in 0.000016/7 = 0.0000022857 of the pool’s hash rate. That is a small number! But it is not as small as a quintillion is big. Put them together and we need 10,971,360,000,000 hashes per second. Let’s see, kilo, mega, giga, tera… I think we’re at tera. 11 Th/s. This seems large. My crappy little GPU miner was pulling in about 88 Mh/s. We need something a hundred thousand times more powerful. This cannot be affordable.
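
Restating that arithmetic in one line, just to sanity-check it:

  required hash rate ≈ (0.000016 BTC / 7 BTC) × 4.8 × 10^18 H/s ≈ 1.1 × 10^13 H/s ≈ 11 Th/s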

And it isn’t. The first result in a search for “ASIC miner” is the Antminer S19, at 95Th/s, for $16k. For sixteen thousand dollars, we can haul in about two bucks an hour. It will take this unit something on the order of a year to pay for itself. The next “Bitcoin halving” is not for three years, so that leaves a little time to make a profit, I suppose. Who knows what is going to change in the next year though!
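
Checking that roughly: the S19’s 95 Th/s is about 8.6 times our 11 Th/s target, so around $8.60 per block found, or about two dollars an hour; at that rate, $16k takes on the order of 8,000 hours, which is most of a year.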

The Antminer S9, at 13Th/s, is much more reasonable at around $700 with shipping. We could actually do this. A dollar every four hours, this unit takes about four months to pay for itself, and then it’s printing money after that!
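
Checking that one too: 13 Th/s is a bit more than the 11 Th/s target, so call it $1.18 per block, or roughly $6.50 a day; $700 divided by $6.50 is a little over 100 days, so about three and a half months.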

I don’t have $700 lying around at the moment, but I could save that up in a few months. So this will be a summer project. Until then, back to writing some software!

Whoops, new recipe

So it turns out cgminer wasn’t working at all! There was an error message about clCreateCommandQueue that caused it to deactivate the GPU device, and it was just sitting there displaying 0’s all night. I declared victory and went to bed when I saw it was running at all; I figured the 0’s would change to numbers after a while. I spiked the ball on the five yard line, basically.

I tried all of the 64-bit LTS distros back to 12.04 with no luck, and then decided to try 32-bit 16.04.5 just for grins. The AMD drivers won’t compile on this platform, but then I discovered that there are open source versions of all this stuff! So, here is the amended recipe.

sudo apt install autoconf libtool libcurl4-openssl-dev pkg-config ocl-icd-opencl-dev clinfo
sudo apt install mesa-utils  # not sure if this one is strictly necessary, might be just for informational purposes e.g. glxinfo
sudo apt install mesa-opencl-icd

Then, download and unpack cgminer 3.7.2, and in that directory:

autoreconf -i
./autogen.sh
CFLAGS="-O2 -Wall -march=native" ./configure --enable-opencl
make
sudo make install  # not sure if this is necessary either

And now I see actual numbers. Slush Pool has recognized that the worker exists, so that’s good… no stats showing up yet though. So this may also be a bust. We shall know in the morning, I am spiking the ball and going to bed.

Bitcoin Recipe

Here is the final recipe for how I got this thing working.

Hardware: HP Envy 700 PC with (I think) AMD Radeon R7 240; I say “I think” because after everything is installed, clinfo reports AMD Radeon HD 8500 Series.

OS: Ubuntu 18.04.4 with 5.3.0-28 kernel. The kernel is important – using 5.4.0-70 (the default) doesn’t work.

When installing 18.04.4, turn off “download updates while installing”, and once Ubuntu is running, politely decline any updates that are offered. Then install the extra stuff we need:

sudo apt install autoconf libtool libcurl4-openssl-dev pkg-config ocl-icd-opencl-dev

Download the Ubuntu drivers from AMD and install with sudo ./amdgpu-install -y --opencl=pal,legacy

Then get the cgminer 3.7.2 source and:

autoreconf -i
./configure --enable-opencl
make
sudo make install
cgminer -o <pool url> -u <username> -p <password>

And it is off and running! The pool restarts it several times, downgrading the difficulty from 8192 to 1638 to 512, and then once more because it detected a new block (surely could not have done that without my help). Then it kinda sits there with a bunch of zeroes indicating it is contributing precisely nothing. The Slush Pool dashboard hasn’t registered so much as a blip yet, but surely we need to leave this thing running for a while to see anything. So it’s back to bed for me.

Bitcoin 4

Okay, we need to put this Bitcoin thing to bed. It is now or never.

At the end of our last episode we were considering using mxe. This turns out to be a bit of a rabbit hole… libcurl dev is missing, and there are complicated instructions as to how to fix this. Sounds like a time sink so we are going to try another avenue.

The garbagey old linux box we used in episode 2 had an NVIDIA card, but our garbagey old windows box has AMD, which seems friendlier to opencl. So, I have installed Ubuntu 20.04 alongside windows on the AMD machine, and we’re going to find out if we’ve perhaps made a slightly less garbagey linux box.

As usual we have to install autoconf, libtool, and libcurl4-openssl-dev. Then run autoreconf -i, ./configure --enable-opencl, and make. This brings us the following error:

config.status: executing depfiles commands
config.status: error: in `/home/jborland/Documents/cgminer-3.7.2':
config.status: error: Something went wrong bootstrapping makefile fragments
    for automatic dependency tracking.  Try re-running configure with the
    '--disable-dependency-tracking' option to at least be able to build
    the package (albeit without support for automatic dependency tracking).
See `config.log' for more details

Looking through config.log, it looks like gcc is not understanding some of the options it’s being fed. Furthermore it looks like it’s gcc 9.3.0. Weren’t we using 10-something in cygwin? Well before we upgrade, let’s follow this dependency tracking advice and see if we get any farther.

That seems to work, but it can’t see OpenCL, and AMD’s latest drivers are for Ubuntu 18.04! They won’t install on 20.04. For pete’s sake. Thanks to this tip we locate an older version that doesn’t refuse to install on a newer OS. This card is older than dirt anyway, right? How much could this stuff really have changed since 16.04?

Well that doesn’t fly, unsurprisingly I suppose. So let’s be good little girls and boys and start over with 18.04, which the newer AMD drivers should play nice with.

I used an old 18.04 install disk, and it turns out it has a 4.15 kernel that AMD doesn’t like! So, I upgrade to the 5.4 kernel, and uninstall and reinstall the AMD drivers. This quickly turns into a dumpster fire.

Installing 18.04.4 from the outset seems to be the way to go. And I also discover that I need to install pkg-config for this all to work.

And ocl-icd-opencl-dev.

IT’S ALIVE! I will put the whole recipe together in a new post.

Bitcoin 3

At the end of our last episode we were working on getting cgminer running on our garbagey windows box. For this we are using cygwin, which in my humble estimation is an absolutely awesome and essential piece of software.

I won’t go into the details of cygwin, beyond noting that when you install it, you should select libtool, autoconf, and OpenCL as additional components in the installer. From there it is pretty much like the other platforms: autoreconf -i, ./configure --enable-opencl, make, make install. The one difference is that before the make, I had to export CFLAGS="-fcommon", otherwise there are a bunch of errors about redeclared symbols or some such.

This gets cgminer to the point where it can run, but we immediately get the following error:

 [2021-04-07 18:42:27] Started cgminer 3.7.2
 [2021-04-07 18:42:27] clDevicesNum returned error, no GPUs usable              
 [2021-04-07 18:42:27] All devices disabled, cannot mine!

What?!? clinfo tells me I have a usable device! I have an AMD Radeon R7 200 Series, which I am told should work for this – and I had it working with some ML libraries I was playing around with a few blog posts ago, so I know it can do mathy stuff.

I see in the cgminer readme it is recommended to use ‘mxe’ for windows – we will try that next time.

Bitcoin 2

Okay, so it turns out you don’t strictly need a gmail account to use this bitcoin.com app – that’s only if you want to back things up with Google, which we don’t care about yet. One thing at a time here! The first order of business is to tell Slush Pool where our wallet is. For this we need a 34-character wallet address, which is a little tricky to find in the app. I found it by tapping on “My BTC wallet” and then tapping on “RECEIVE”. This shows you a QR code. Tapping on the QR code will copy the address into your clipboard. I stumbled on this completely by accident; I am not sure what the big secret is, but there you have it.

It’s not completely obvious how to get this address into Slush Pool, either. I went into Settings and then into “BTC Payouts” (under “Bitcoin”). There is a field in there called “Payout Address”, and if you click the pencil you can paste in your wallet address. Don’t forget to hit “Submit” at the bottom of the page (I definitely didn’t forget to do this the first time around).

Now Slush Pool knows where to put our money so we can get rich! The next step is to install CGMiner on our garbagey old mac.

Installing CGMiner on our garbagey old mac

I won’t go into too much detail here because it’s all pretty easy to find on the internet – I will just tell you about the speedbumps I hit. The first one came when I ran ./configure for the first time and received a complaint about libcurl dev not being installed.

Installing curl was kind of mysterious so I went with the homebrew approach. Homebrew is easy to install and apparently when you type ‘brew install curl’ it installs both curl and libcurl dev. So that’s convenient. It still doesn’t work though! You have to also throw these in:

export LDFLAGS="-L/usr/local/opt/curl/lib"
export CPPFLAGS="-I/usr/local/opt/curl/include"
export PKG_CONFIG_PATH="/usr/local/opt/curl/lib/pkgconfig"

./configure gets a lot farther now, but craps out with this message:

configure: error: No mining devices configured in

Ah, we have to configure a mining device like so:

./configure --enable-avalon

But which one do we choose? There’s a whole list of them, and I have no idea what any of this means. I do know one thing, actually – we can get rid of anything that has ASIC in the name, because we don’t have one of those. That leaves:

  DragonMint.T1.Halong.: Disabled
  Antminer.S1.Bitmain..: Disabled
  Antminer.S2.Bitmain..: Disabled
  Antminer.S3.Bitmain..: Disabled
  BitForce.FPGAs.......: Disabled
  Drillbit.BitFury.....: Disabled
  Icarus.ASICs/FPGAs...: Disabled
  ModMiner.FPGAs.......: Disabled

Hmmm, a lot of these are FPGAs.

What’s an FPGA?

Field Programmable Gate Array. The internet makes this sound like something you have to build. If I had one of these, I’d probably know it. So we can get rid of those. The others, it turns out, are all ASICs. So we’re screwed.

We can go back to the last version of cgminer that supports GPU mining, which, I think, is something we can do on these crappy computers. We will almost certainly be losing money by doing this, but it will be on the scale of pennies, so we are going to plow ahead out of morbid curiosity.

Getting this version of cgminer to compile involves some tinkering. I used homebrew to install autoconf, automake, and libtool, and then had to run autoreconf -i. From there I appear to have only one choice of mining device, so I configure with ./configure --enable-opencl. Then I run make, and uh-oh:

util.c:1001:9: error: implicit declaration of function 'clock_nanosleep' is
      invalid in C99 [-Werror,-Wimplicit-function-declaration]
                ret = clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, ts...
                      ^
util.c:1001:42: error: use of undeclared identifier 'TIMER_ABSTIME'
                ret = clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, ts...
                                                       ^
2 errors generated.

Apparently this clock_nanosleep just isn’t implemented in macos, and this is suddenly way above my pay grade. Let’s try linux! I have an (even older) mac that has ubuntu 20.04 installed on it. Getting the libcurl dev stuff is a little different, I have to do an apt install libcurl4-openssl-dev, but the rest is pretty much the same and it gets me past this clock_nanosleep thing.

Installing cgminer on our garbagey old linux box

This machine has an NVIDIA graphics card in it, and I have to install an OpenCL-compatible driver from NVIDIA to get this to work. Fortunately NVIDIA has done a pretty good job of providing this! This finally allows me to make and make install, and I think we are ready to go!

Oops, not quite. We also have to install the cuda stuff from NVIDIA. This doesn’t go as well for me. We are throwing good time after bad on this.

Installing cgminer on our garbagey old windows box

Stay tuned for the next episode!

Bitcoin

I have decided to take a little break from the ledger, and try Bitcoin mining! Maybe I can make some spare cash with my free time. Or not, who knows? Only one way to find out.

I know next to nothing about Bitcoin, so I am going to post as I learn, and you can learn along with me. I’m going to assume that you have some idea of what Bitcoin is, and maybe you understand that blockchain has something to do with it and there’s a lot of math. So let’s get on with it!

What is Bitcoin Mining?

Bitcoin Mining is, I am told, how Bitcoins are created. There is apparently a lot of math involved in creating a Bitcoin – something about blockchains, probably – and it takes a computer a long time. Maybe ten days? Maybe ten years, on the old clunkers I’ve got lying around that I plan to use for this.

Every time you successfully create a Bitcoin, you get some reward (in Bitcoin, naturally) for donating your processing power to the cause. This is how the money is made.

What do I need?

I have two computers that I am going to attempt this on. One is a mac, the other windows. I am not going to go into too much detail about them just yet, since this may not work at all. But I’m going to start with the mac since that is more Bitcoin-friendly as far as I can tell.

For the software, a rudimentary online search suggests that CGMiner might be an okay place to start. Their documentation says that I can start mining like this:

Single pool:

cgminer -o http://pool:port -u username -p password

What’s a pool?

Okay, apparently it really will take years if I attempt this myself. So I can join a “pool”, where I’ll get a little slice of Bitcoin according to how much processing power I can chip in (hint: not much). But that’s better than waiting a few years and maybe ending up with nothing at all.

Which pool should I join?

Another rudimentary online search will provide the names of many popular pools – Poolin, F2Pool, Antpool, Slush Pool. These are all big popular ones, all based in China except for Slush Pool, which as far as I can tell is in the Czech Republic. Is there a way to “Buy American” on this? All the stuff I’ve read says that it is better for Bitcoin in general to be decentralized and not have a few big players dominating the market.

Well, Marathon and Titan seem to have “North American Pools”, but it looks to me like they are geared toward high-end corporate clients. Since I’m just a shmoe and I don’t know what I’m doing, I’m going to start with Slush Pool, since they are the “original”. Signup is a snap and they have a jazzy interface, so let’s do it.

What’s an ASIC?

Uh oh, Slush Pool is telling me I need an ‘ASIC’ and that my old garbage computers won’t do very well at this. An ASIC is a specialized computer just for Bitcoin mining, and the good ones cost thousands of dollars! This is starting to sound like work. Let’s see if I can get the old garbage computers working, and make a few pennies. Then I can work my way up to the Lamborghini ASIC.

Okay, so Slush Pool has given me a URL, a username and a password. That’s all I need to run CGMiner! Wait, I need a ‘wallet’ to collect my winnings!

What’s a wallet?

That’s where you keep your Bitcoin, apparently. Slush Pool is telling me to use Trezor Hardware Wallet, and I have no idea what I’m doing, so that sounds great! One less decision to make.

Wait, what? The Trezor is a physical thing I have to buy! It’s only 60 bucks, so I could buy this thing, but isn’t there a digital option?

So I go to bitcoin.org and go through their lengthy and detailed wallet selector, and wouldn’t you know it, they recommend… their own digital wallet app! Okay, now I have a digital option and I don’t have to think. I am installing it.

Ack, there’s two of them! The one that bitcoin.org recommends has a lower rating than the other one, which is from bitcoin.com. And they both have about 5k reviews. So I’m going with the bitcoin.com one. What do I know? Worst that can happen is I lose $3 of Bitcoin. Once I get one of those fancy Trezor things I won’t need this app anyway.

Aah, fudge. It will only let me sign in with a Google account! I don’t want to use my Google account for this. I will create a new one and give it a crazy long password, and use my original Google account for recovery. That should do it.

That is for tomorrow, as it is now my bedtime. But I think I am off to a good start.