Old Core wallet to Knots

Hi, I found this forum trying to find people that use Knots. I hope someone knows how to do this.

If there is an old 2013 Bitcoin Core wallet.dat which was last used in a 2015 version of Core and you want to run Knots, given that the wallet is an old format, what should you do to not screw up in the process?

It appears there are 2 options, “Migrate wallet” and “Restore wallet”. I’m not sure what they do or in what order they must be done. Of course backups do exist, but I want to know what is going on.

The end goal is to have a setup like this:

  1. Laptop 1: It runs Bitcoin Knots full node, with the full blockchain synced, used to broadcast transactions and be able to monitor the wallet in watch-only mode.
  2. Laptop 2: It runs Bitcoin Knots as a wallet, to craft transactions, never connecting to the internet; the signed transaction is then moved to the Laptop 1 node and broadcast.

I think back then you had to use the console with some convoluted raw-transaction commands, where you risked typing the wrong comma in the wrong spot, or copy-pasting the wrong address, and screwing something up. I think there is now the PSBT thing, which is integrated into the GUI. Ideally, I could use Coin Control, select which addresses I want to spend from, generate a PSBT file that contains this information, and then load this file into the node on the online laptop to broadcast the transaction. I have never done this, so I will test first with testnet coins. But hopefully someone here knows how to do it properly.

But for now, let’s first solve how to port the old wallet file into the new wallet format without screwing things up. Then we can worry later about how to do the 2-laptop setup to transact. Meanwhile I will continue to sync the node, which is taking like 4 days, so I’m posting this here so I can learn the next steps in the meantime.

I am using Debian 13 btw, on both laptops. I still have to install it on the cold storage laptop, where I will load the wallet; this is the offline wallet. Then I will need to learn how to generate a watch-only wallet for the online laptop. Then I need to learn the PSBT thing to make the transactions. It’s a lot of work and I don’t want to create a separate thread for each thing, so maybe we can solve this here? I understand things better if they are step by step, ideally in video, but I didn’t find any, so I’m asking here. The manuals are not enough; I would like to talk to someone that has actually done it and encountered and solved any problems in this process.

Please let me know. Thanks

Disclaimer: I have not done this process exactly, so there will likely be inconsistencies to work through. I have done a more-or-less analogous process with Electrum and an old paper wallet, so the broad strokes should be similar.

Backups & hygiene (most important)

  1. Make at least 2 copies of the original wallet.dat on separate media (e.g., two USB thumb drives).
  2. On Debian, run sha256sum wallet.dat and write the hash on paper; verify each copy matches.
  3. Do not reuse the original file; always work on a copy. Never open the same wallet in two places at once; if something were to go wrong, you could lose both copies.
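Step 2 can be scripted; a minimal sketch (the filenames here are placeholders standing in for your real wallet.dat and its copy):

```shell
# Placeholder file standing in for the real wallet.dat; substitute your copies.
printf 'dummy wallet data' > wallet.dat
cp wallet.dat wallet_backup.dat

# Hash both and compare; write the original's hash down on paper.
h1=$(sha256sum wallet.dat | cut -d' ' -f1)
h2=$(sha256sum wallet_backup.dat | cut -d' ' -f1)
echo "original: $h1"
echo "backup:   $h2"
[ "$h1" = "$h2" ] && echo "copies match" || echo "MISMATCH: do not trust this copy"
```

Run the same comparison against every copy you make, not just the first one.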

Prepare the two laptops

Laptop 2 (offline signer):

  • Physically remove WiFi and Bluetooth radios if possible. Physically disconnect any network cable. Once powered up into Debian, run: rfkill block all.
  • Copy over Knots via USB after verifying the signatures.
  • Create a private wallet directory (permissions 700), e.g. ~/knots_wallets/legacy/, and put a copy of wallet.dat inside as wallet.dat.
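A sketch of that directory setup (the paths are illustrative; adjust to your layout):

```shell
# Illustrative paths; adjust to your setup.
mkdir -p "$HOME/knots_wallets/legacy"
chmod 700 "$HOME/knots_wallets" "$HOME/knots_wallets/legacy"
# Then copy the wallet in from your USB mount point, e.g. (example path):
#   cp /media/usb/wallet.dat "$HOME/knots_wallets/legacy/wallet.dat"
ls -ld "$HOME/knots_wallets"
```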

Laptop 1 (online full node):

  • Run Knots as a full, non-pruned node (already done – just finish the IBD).
  • You’ll import a watch-only wallet here later.

Safest way to load and migrate the old wallet (offline only)

Knots should be able to open old Berkeley DB wallets as “legacy” wallets. The modern standard is a descriptor wallet. Migrating is one-way, which should be fine here since we are ensuring that we have backups.

On Laptop 2 (offline):

  1. Start Knots with your legacy wallet (GUI: “Open wallet” and point at the folder, or CLI with -walletdir pointing to ~/knots_wallets).
  2. Take another fresh backup from Knots (“Backup wallet”) so you have a post-open snapshot (just being paranoid here).
  3. Upgrade to descriptor (this might be labeled as “Migrate wallet” in the GUI; via RPC it’s migratewallet – note that the similarly named upgradewallet RPC is a different operation that only bumps a legacy wallet’s internal version). This creates internal descriptors for your keys/addresses. Note that I have not done this myself, so I am relying on various search results (there is some discrepancy on the label – maybe due to different Knots/Core versions). Let me know if you can’t find anything that seems relevant here, and I’ll do some more digging.
  4. After upgrade, export public descriptors (GUI export if present, or RPC):
  • listdescriptors false exports the public (no private) descriptors, including the active and internal (change) ones, with useful timestamps.

Save those descriptors to a text file on a clean USB (this file is safe to move – no secrets if you used false for listdescriptors).
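One quick way to double-check that before the file ever leaves the airgapped box: descriptor private keys contain the string xprv (tprv on testnet), so a simple grep catches an accidental private export. A self-contained sketch using a dummy export file (the key material is a placeholder):

```shell
# Dummy stand-in for the listdescriptors false export (placeholder key data);
# the real file comes from the RPC/GUI export on the offline box.
cat > descriptors_public.json <<'EOF'
{"wallet_name": "cold_migrated",
 "descriptors": [
   {"desc": "wpkh([abcd1234/84h/0h/0h]xpubPLACEHOLDER/0/*)#aaaaaaaa",
    "active": true, "timestamp": 1388534400, "range": [0, 999], "internal": false}
 ]}
EOF
# Descriptor private keys contain "xprv" (or "tprv" on testnet):
if grep -q 'xprv\|tprv' descriptors_public.json; then
  echo "DANGER: private key material present, do not move this file"
else
  echo "no private keys found, OK to move"
fi
```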


Build the online watch-only wallet

On Laptop 1 (online):

  1. Create a new empty wallet (GUI: “Create wallet” > disable private keys / “watch-only”).
  2. Import the public descriptors you exported from offline:
  • GUI import (if present), or RPC importdescriptors with the JSON from your file.
  • Make sure the active, internal, range (for HD paths), and timestamp fields are preserved so Knots rescans correctly.
  3. Let it rescan the chain (can take a while). When done, balances and UTXOs should show as watch-only.
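The field check from step 2 can be done mechanically with jq before you import. A self-contained sketch on dummy data (the descriptor is a placeholder):

```shell
# Dummy export file (placeholder descriptor); the real one comes from
# listdescriptors false on the offline box.
cat > descriptors_public.json <<'EOF'
{"descriptors": [
  {"desc": "wpkh(xpubPLACEHOLDER/0/*)#bbbbbbbb",
   "active": true, "timestamp": 1388534400, "range": [0, 999], "internal": false}
]}
EOF
# Prints "true" only if every entry carries the fields needed for a correct rescan.
ok=$(jq '[.descriptors[] | has("active") and has("internal")
          and has("timestamp") and has("range")] | all' descriptors_public.json)
echo "$ok"
```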

Sanity check:

  • On the offline wallet, generate a receive address and write it down.
  • On the online watch-only wallet, derive/show a receive address. It should match. (Matching here proves the descriptors line up.)

PSBT flow

Create (online):

  • In the online watch-only wallet, enable Coin Control. Pick the UTXOs, set outputs, and choose “Create PSBT” (GUI) or walletcreatefundedpsbt (RPC). Save the .psbt to USB.

Sign (offline):

  • On the offline wallet, open/sign the PSBT (GUI: “Load PSBT” > “Sign”; RPC: walletprocesspsbt). Save the partially or fully signed PSBT back to USB.

Broadcast (online):

  • On the online node, Finalize & Broadcast (GUI) or finalizepsbt then sendrawtransaction with the hex.
  • Verify the txid and watch it confirm.
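One cheap integrity check when shuttling .psbt files over USB: per BIP 174, a binary PSBT always starts with the magic bytes "psbt" followed by 0xff, so its base64 encoding always starts with cHNidP8. A sketch with a crafted stand-in file (not a real transaction):

```shell
# Crafted stand-in: just the 5 magic bytes, not a usable PSBT.
printf 'psbt\377' > sample.psbt
b64=$(base64 -w0 sample.psbt)
echo "$b64"
case "$b64" in
  cHNidP8*) echo "magic bytes OK" ;;
  *)        echo "not a PSBT file" ;;
esac
```

This only proves the file wasn't truncated or replaced by garbage in transit; it says nothing about the transaction's contents, so still review outputs and amounts before signing.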

Extra safety tips

  • Never plug the offline box into the internet until you have drained the wallet, moved everything to a new home, and waited for many confirmations.
  • Label and keep your USBs separated: “PUBLIC (descriptors/PSBTs)” vs “PRIVATE (backups)”.

Hi, thanks for the very nice tutorial, I think I will be able to follow, but before trying any transactions, I will first try with a testnet wallet. I will need to see how to get testnet coins nowadays.

But for now, the node continues to sync. For some reason, at about 1 year and 25 weeks remaining or so, the download activity becomes very sparse, with gaps between activity. Right now I saw a huge 10-minute gap of no activity, then it downloaded a bit, then it paused again. I was like 22 hours away from completing the sync, but right now it may take like 4 days… I don’t know why that happens.

I also had some checksum problems with some block files a couple of times, which seems to be resolved by using -par=1. I think the CPU was overheating, or who knows. I’m using dbcache=2048, which I think is reasonable for 8GB RAM. It is an external SSD with FDE enabled, plugged into a USB 3.0 port. The SSD is from a good quality brand and a relatively new model. The laptop is however 10+ years old, running an i5. Nonetheless, I don’t understand why it started becoming really slow at around 73%, taking big breaks between activity. Hopefully this is not the result of non-monetary usages of the blockchain, where the computer has to spend more time dealing with more complex signatures once you start downloading certain blocks or something.

In any case, I will leave this running and hopefully it finishes this week. I hope this 2TB drive lasts before the blockchain fills it up, because doing this from scratch gets more time-consuming each time I have done it, as the blockchain increases in size and complexity. I don’t trust using pruned mode or assumevalid, so I do it the oldschool way only, meaning that I host the entire blockchain and fully check every block. Anyway, thanks again for the tutorial, I will try it whenever the sync is done.

It is exactly this. With 8GB RAM it will need to do a lot of file swapping (which is relatively slow even over USB 3.0). 8GB RAM is fine for normal everyday use, but the IBD is brutal. If you have any option to add another 8GB RAM, the difference would be substantial (otherwise it is just a waiting game).


Wow, so this showcases the importance of making Knots the #1 node and stopping the spam transactions. Most people that think non-monetary transactions are ok have probably never synced a full node in their lives, especially doing it properly without any shortcuts like assumevalid or whatever. I think everyone should host a copy of the entire blockchain, and anything that makes this more difficult is to be avoided. I think the full node software should even give a warning about using assumevalid, since if we keep promoting this pruning thing, then at some point copies of the entire blockchain will be a relic, and the whole point of Bitcoin is being able to trace everything back to the genesis block. You don’t want to end up with a few full blockchains stuck on some old HDDs because people stopped caring about maintaining them.

Anyway, I will let it run and hopefully this ends before xmas.


Woke up to a speed of 0.02%; at this rate it will take like 8 weeks to finish, which is not normal at all. I don’t think that even with all the spam in the world it should take this long. Something is probably going on with the computer. I may need to give up on the laptop, plug the SSD into the desktop, hope it still works when booted on different hardware, and hope that it downloads faster. The desktop is much newer. I would still like to be able to run this on a laptop, but it will never finish at this rate.

It should work on another computer as long as the OS is the same (or supports the same format). And when finished, you should be able to use it again back on the laptop. Does the desktop have more RAM?

The node runs on an external SSD within an encrypted Debian installation, so everything should stay the same except the drivers. I don’t know if Debian will simply find the drivers for the different hardware and run normally, or if it will give me a black screen of sorts due to lacking drivers for the GPU or something. I guess there is only one way to find out. I remember trying a desktop install on the laptop and it seemed to work, so I will try the other way around now and see what happens. Hopefully it works, because the point of using an external SSD was to have a portable node that you can connect to any computer on the go if needed. If you have different laptops in different spots, or different desktops, you could swap by simply carrying the drive.

But I did turn off the laptop for some hours and the speed improved; it was downloading at 2% per hour at some point with 10 hours remaining, but then it started going slow again and right now it’s at 0.3% with 2 days remaining. I will try to plug it into the desktop at night since I’m using it now. Yes, the desktop is much more powerful, with 32GB RAM and 16 cores. What dbcache and par values would you use? I’ve read that a third of the RAM is recommended for dbcache; not sure about par. I’m using par=2 right now on the laptop and it hasn’t crashed yet (it is a 4-core i5).

Looking at the UTXO set size chart on the statoshi website, this may explain why it gets worse around this point, where it peaked at around 12GB. It’s down to 11.6GB now. Not sure if that is due to the peak of interest in Inscriptions stuff or the Knots anti-spam nodes, but hopefully it goes back to a more manageable size. But there was something abnormal going on when it was downloading at like 0.02%. I think the hardware was fried from hours of running with no rest or something; at least right now it’s at 0.3%, sometimes higher, and hopefully it doesn’t go back to mega slow progress. The temps definitely went a bit lower and are now under 80ºC.

Hi, I finally tried with the desktop. The desktop did in an hour what the laptop was going to take forever to do. So this shows there was not a problem with the portable SSD. Thankfully it did boot, and apparently Debian is able to load drivers on the fly during boot or something; fastfetch shows the correct hardware specifications for my desktop, and now I’m back on the laptop with the node fully synced. This may be a solution for those struggling with older laptops: using a portable SSD for the node. However, I do not enjoy dealing with Bitcoin things on non-Corebootable computers, but it was just impossible to finish the sync at that rate, and there are no private keys or even watch-only wallets on the drive yet, and I doubt that plugging it into a modern AMD desktop computer for some hours to sync would end up with some exploit installed on the SSD. The AMD desktop has 32 CPU threads and 32GB RAM, so even with the spam it pushed through with ease.

The node is fully synced, and thankfully once synced it seems to stay at a reasonable temp on the laptop and is manageable. There is a small but constant stream of activity on the network graph. So everything seems to be ok now on the node laptop.

Now it’s time to install Debian on the airgap setup. The problem is I don’t have a secondary laptop available right now, so I’m not sure what to do.

Should I wait to buy a secondary laptop, or would you install Debian on another portable disk and use the desktop and the laptop? I was planning to buy a secondary laptop and install Coreboot there to manage the private keys. Unfortunately, I think the laptop I’m using for the node cannot have Coreboot installed. But for now it’s what I’ve got.

The thing is, I use the guided installer with FDE for Debian, since I don’t know how to manually set up the correct partitions, and depending on the hardware you install it on, the guided partition setup will use a different swap size for instance, so ideally I want to have the hardware I’m going to finally use before I install it. I guess I could use the laptop to install it, since it has similar specs to your average Corebootable laptop, and meanwhile try to find a laptop. I’m tempted to use the desktop, but I’m just paranoid about using modern computers in general for these things. Anyway, let me know what you would do from here. Thanks.

To be a little pedantic, another thing this shows is that 32GB RAM drastically reduces the amount of file IO necessary to complete the IBD (by orders of magnitude). In other words, there could technically still be an issue with the external drive itself related to file IO speed under load, which having the extra RAM would overshadow. Worth considering as a possibility in case it has future implications.

As far as the airgapped computer, I would suggest getting a separate dedicated system to run it on (you mentioned acquiring another laptop). Personally, in my case I use an old, low-spec, repurposed tiny PC (because they are robust, easy to work on, and I know how to physically remove the radios – a lot of laptops have the radios soldered into the motherboard and not possible to physically remove). But the drawback there is I also need a monitor, keyboard, and mouse, so they are less convenient than setting up an air-gapped laptop like you are doing.

Also, for the related process that I mentioned (moving funds from an old paper wallet to a more secure cold storage solution), I didn’t use Debian or Core/Knots; in my case I used Tails OS and Electrum. The context is different in your case, so you have different constraints and a different risk analysis to work through.

This laptop supports up to 16GB so I may upgrade. As far as using Electrum goes, I want to only use 2 Knots instances. Using different software increases complexity (not necessarily in the method, just more variables).

I need to do this with testnet coins first. I forgot how testnet works. Do I have to download its own testnet blockchain and get coins from some faucet? Does anyone know one?

And I also need to see how to do this through Tor.

Hi, could you explain how to do this? How do I check that? Do you mean opening the .json file, and if so, what do I look for exactly?

I have now fully installed Knots, fully synced, on the online node, and Knots on what will be the airgap laptop.

I was considering installing Bitcoin Core from 2013 in order to simulate the entire thing with testnet coins (by creating a 2013 wallet, getting some testnet coins, and then doing the migration and the entire thing manually).

Is this overkill, and am I safe if I just use backups? Or could I screw up during this process and lose funds?

I’m also just not sure if installing Bitcoin Core from 2013 in another folder will screw up my ~/.bitcoin blockchain files, which took like a week to set up.

Sure. Note that I have not done this process myself, so you’ll need to make adjustments as necessary.

From the offline box, you should have gotten something like:

{
  "wallet_name": "cold_migrated",
  "descriptors": [
    {
      "desc": "wpkh([abcd1234/84h/0h/0h]xpub.../0/*)#u5t4kl9d",
      "active": true,
      "timestamp": 1388534400,
      "range": [0, 999],
      "internal": false
    },
    {
      "desc": "wpkh([abcd1234/84h/0h/0h]xpub.../1/*)#p8d0yq56",
      "active": true,
      "timestamp": 1388534400,
      "range": [0, 999],
      "internal": true
    }
  ]
}

The importdescriptors RPC expects an array of objects with just the needed fields:

[
  {
    "desc": "wpkh([abcd1234/84h/0h/0h]xpub.../0/*)#u5t4kl9d",
    "active": true,
    "internal": false,
    "timestamp": 1388534400,
    "range": [0, 999],
    "label": "external"
  },
  {
    "desc": "wpkh([abcd1234/84h/0h/0h]xpub.../1/*)#p8d0yq56",
    "active": true,
    "internal": true,
    "timestamp": 1388534400,
    "range": [0, 999],
    "label": "change"
  }
]

The following command should make the above-mentioned changes and save them to a new JSON file (but sanity-check the output to be sure it looks fine):

jq '.descriptors
    | map({desc, active, internal: (.internal // false), timestamp, range, label: (if (.internal // false) then "change" else "external" end)})' \
  descriptors_public.json > import.json
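Here is a self-contained dry run of that command on dummy data (the xpubs are placeholders), so you can see the shape of the output before touching the real export:

```shell
# Dummy listdescriptors-style export with placeholder keys.
cat > descriptors_public.json <<'EOF'
{"wallet_name": "cold_migrated",
 "descriptors": [
   {"desc": "wpkh([abcd1234/84h/0h/0h]xpubPLACEHOLDER/0/*)#cccccccc",
    "active": true, "timestamp": 1388534400, "range": [0, 999], "internal": false},
   {"desc": "wpkh([abcd1234/84h/0h/0h]xpubPLACEHOLDER/1/*)#dddddddd",
    "active": true, "timestamp": 1388534400, "range": [0, 999], "internal": true}
 ]}
EOF
# Same transform as above: pull the array, keep only the needed fields,
# and label external vs change chains.
jq '.descriptors
    | map({desc, active, internal: (.internal // false), timestamp, range,
           label: (if (.internal // false) then "change" else "external" end)})' \
  descriptors_public.json > import.json
jq -r '.[].label' import.json   # prints "external" then "change"
```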

Now you can import the file. If you saved it to the current folder as “import.json”:

# Ensure the correct wallet context:
bitcoin-cli -rpcwallet=watchwallet getwalletinfo

# Import descriptors from file via stdin (avoids shell-quoting mess).
# Note: -stdin reads one argument per line, so compact the JSON onto a
# single line with jq -c first:
jq -c . ./import.json | bitcoin-cli -stdin -rpcwallet=watchwallet importdescriptors

Now you can run the verifications that I mentioned:

active/internal flags are set correctly

Run:

listdescriptors false

In the output, confirm you have exactly two active: true descriptors (one with "internal": false, one with "internal": true). The strings should match what you imported.

range is preserved

In the same listdescriptors output, check each ranged descriptor has the range you set (e.g., [0,999]) and a reasonable "next" index (Knots tracks how far it has derived). If your old wallet used many addresses, choose a range big enough to cover them; you can widen it and re-import if needed.

timestamp is correct (rescan starts far enough back)

Immediately after import, run:

getwalletinfo

Look at the "scanning" object for progress. If you accidentally set "timestamp": "now" you won’t get a historical rescan (you may see balances missing). Fix by re-importing with an older timestamp (or run rescanblockchain back to your coin’s birth height).
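If you do need an older timestamp for the re-import, epoch seconds for a conservative wallet "birthday" are easy to compute with GNU date (the date here is an example that safely predates a 2013 wallet):

```shell
# Epoch timestamp for 2013-01-01 UTC, a safe lower bound for a 2013 wallet.
birth=$(date -u -d '2013-01-01' +%s)
echo "$birth"   # 1356998400
```

An earlier-than-necessary timestamp just makes the rescan longer; a later-than-necessary one can hide historical balances, so err on the early side.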

Watch-only status

  • Check the wallet is descriptor-based and watch-only:
getwalletinfo

Look for:

  • "descriptors": true
  • "private_keys_enabled": false (or "private_keys_disabled": true depending on version)
  • Pick a known address from your offline wallet and verify on the online wallet:
getaddressinfo "bc1q....."

Ensure "iswatchonly": true and "solvable": true.


I don’t think I’m following about the modification of the .json file. So when you use “listdescriptors false” to export the descriptors for the watch-only wallet, you get the data in the wrong structure and you have to manually modify it so you can import it properly? But why does this happen? I don’t get it; why would one need to do that? Shouldn’t it just give you the right info, so you save it as .json and import it on the watch-only wallet? Or is this only if you use RPC? I don’t use RPC; I just have the 2 laptops running and that’s it. I never needed RPC, so I think I’m getting confused with these extra steps.

I don’t understand these things. And why do you need to mess around with manually editing a json file? It looks like if I do something wrong, it’s a guaranteed recipe for disaster. There are too many addresses to check one by one.

I also do not understand this. I have wallet 1 (offline) and wallet 2 (online). Wallet 2 contains no private keys, as we know, just watch-only stuff, since we imported the .json file that was created by copy-pasting the output of “listdescriptors false”. So my question is, I want to create a new receiving address… I suppose I need to create new receiving addresses on the offline wallet. So what do I do with this? The online wallet will not have the public key corresponding to this private key. Or does this work? When you do “migrate” on the 2013 wallet, it gets converted into HD, and then you do “listdescriptors false”, and something in this .json file that you import into the new empty watch-only wallet somehow links it with the offline wallet? It derives public keys in the same order as you are generating private keys on the offline wallet, and they always match in order? This is so complicated to understand for someone that does not know the nuts and bolts of cryptography and how this works. Since it’s airgapped and does not communicate with the online wallet, I don’t understand how this works. I guess it’s some sort of cryptography magic that has to do with key derivation that I’m missing. But for it to work, I assume you must do the “migrate” thing, because otherwise each private key is independent from the others and not derivable. I need to understand the ins and outs of this to get it.

Yes, definitely take the time to study this and understand the different components before proceeding. If you have the ability to test this process out before attempting on the original, definitely do that.

The reason for editing the JSON file is because the output follows this pattern:

{
  "wallet_name": "cold_migrated",
  "descriptors": [
... STUFF ...
  ]
}

And what the importdescriptors expects is just this:

[
... STUFF ...
]

So the goal is to grab the “descriptors” array from the output, and use that for the import. You can do that by grabbing that part and pasting it into a new JSON file (or by deleting the surrounding stuff from the original JSON file, leaving only the array).
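That extraction can also be done mechanically instead of by hand; a self-contained sketch with dummy data (placeholder key):

```shell
# Dummy export (placeholder key); the real file comes from listdescriptors false.
cat > export.json <<'EOF'
{"wallet_name": "cold_migrated",
 "descriptors": [
   {"desc": "wpkh(xpubPLACEHOLDER/0/*)#eeeeeeee",
    "active": true, "timestamp": 0, "range": [0, 999], "internal": false}
 ]}
EOF
# Pull out just the "descriptors" array, which is what importdescriptors expects.
jq '.descriptors' export.json > import.json
jq -r 'type' import.json   # prints "array"
```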

After that, in my example, I also added a “label” property to each element in the descriptor array, but that may not be necessary (you can test the import without it, if there are too many elements for you to label).

Once you have the new/edited JSON file, then import it as I mentioned with:

jq -c . ./import.json | bitcoin-cli -stdin -rpcwallet=watchwallet importdescriptors

You don’t necessarily have to run those steps for “Make sure the active, internal, range (for HD paths), and timestamp fields are preserved so Knots rescans correctly”, but I would still recommend taking some time to understand how to do various sanity checks to make sure things are going properly during the process. You could probably use an LLM to help you out, or to see if there are easier ways to run some of these steps in the UI if you are not comfortable around the command line. Or it may suggest a simpler way to accomplish your goal than the rough outline of a process that I sketched out here.

But I don’t understand the behavior of this export->import workflow.
In any normal software, you export a file, and then import it. That’s it, you don’t go into the exported file and manually tweak something. So why do you need to do this exactly? Shouldn’t it just work?

Why isn’t there a way to actually export a .json file ready to be imported into the watch-only wallet to begin with? Looks like this is badly implemented. It shouldn’t be this complicated to get this done. I understand opening the file and checking that everything is ok, but what I mean is, it should just work by default.

For sure, give it a try – it may work just fine. Note that I am suggesting a process that I have not run myself, based solely on my understanding of the spec. My understanding is that the JSON structure exported via listdescriptors does not exactly match the structure expected by importdescriptors (the import expects just an array, while the export has that array wrapped in a parent object). I could very well be completely misunderstanding the spec, though (please do not take anything I wrote as having any more weight than a rough outline of a process that requires refinement). Give the direct import a try, and if it takes, then that saves you some headache. If the import step throws an error, then see if making the edits I mentioned fixes the error.

You mean that I should try to copy paste it like this?

And see what happens?

Thing is, I would do it, but I’m paranoid about doing this with actual wallets, so I wanted to do this with testnet coins first, even though this way I would skip seeing if the migration of the wallet is successful (since the testnet wallet is modern). I want to see if the migration of the 2013 wallet works, but I do not want to download an old Bitcoin Core version to create an old testnet wallet, get some coins, and migrate it… too much work tbh.

Do you know what would be the best sanity check to see if the migration of the cold wallet is successful?

I suggest three things (increasing difficulty – stop once it imports without errors):

  1. Import the JSON file unmodified
  2. Cut out just the array (between the square brackets like you have in your last post) into a new JSON file and import that
  3. Take the new JSON file from step 2 and add the “label” property to each element in the array (what I originally suggested earlier in the thread)

The best way to sanity check (besides the checks that I mentioned earlier) is to generate a new address from both the airgapped computer and from the online computer and make sure they match. And of course any balances should be visible from the watch-only wallet (meaning you will want to test it with some bitcoin, not just with empty wallets). I would use testnet to work out the process, and then run a live test on mainnet with a very small amount of sats, to make sure everything migrates, shows up, and can be moved, before committing your real stack.

But to do this whole thing on testnet I would need to download whatever version I used to create the wallet file. And it still wouldn’t be 1:1, since I updated Bitcoin Core a number of times, so I’m not sure what version of the wallet.dat file I have, since Bitcoin Core may have updated the wallet since the first one… Is it possible to know exactly what version of the wallet.dat file you are using, so I could download that exact Bitcoin Core version and go from there?

But still, I would need to download the blockchain AGAIN for the old Bitcoin Core version as well as for testnet… this is a nightmare; it took ages to do this. And you would be doing this with an old version, and I remember older versions being slower for some reason, and you would have to deal with the more complex spam blocks with an older version… I don’t know about that.

I don’t want to use the same .bitcoin datadir folder for this because I don’t want to screw up my files by doing this, so that is why I would need to redownload and revalidate the entire thing. And these older versions do not even have assumevalid option to try to speed up the process (since it’s just for testnet purposes) so yeah, I would be validating the entire chain… should I bother doing this, or should I test with the real files doing backups? I mean as long as the funds do not disappear, it should be fine. I just want to see what happens with the migration wallet, and then try to do the watch-only wallet. I will not be sending funds. To practice sending funds with this method, I will use the testnet, which I already synced. It’s the replication of migrating the wallet that would take ages.