6 hours ago, mDuo13 said:

Update your validator settings!

To use a centrally issued list instead of a manually curated one?

Consensus can only work in a guaranteed way with at least 80% honest nodes in your local UNL and also 80% global overlap of individual UNLs. How does the decentralization strategy ensure this property or even detect potential failures?

11 hours ago, Sukrim said:

To use a centrally issued list instead of a manually curated one?

Consensus can only work in a guaranteed way with at least 80% honest nodes in your local UNL and also 80% global overlap of individual UNLs. How does the decentralization strategy ensure this property or even detect potential failures?

How about a professionally curated list? Because that's what vl.ripple.com is.

For now, everyone using Ripple's curated list has 100% overlap with everyone else using it, so that solves the overlap issue for the time being. In a future with multiple validator lists run by different parties, making sure that the lists have sufficient overlap is going to be a key challenge. We have some other plans to improve on that, which unfortunately I don't think I can talk about. Maybe @justmoon would be willing to spill just a few beans?
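
Purely as a toy illustration of what "overlap" means here (nothing official: the validator names below are made up, and the 80% figure is just the threshold Sukrim cites above, not a parameter I'm asserting):

# Two hypothetical UNLs published by two different list sites
list_a = {"validator-1", "validator-2", "validator-3", "validator-4", "validator-5"}
list_b = {"validator-2", "validator-3", "validator-4", "validator-5", "validator-6"}

# Overlap measured against the smaller of the two lists
overlap = len(list_a & list_b) / min(len(list_a), len(list_b))
print(f"overlap: {overlap:.0%}")  # 80% in this made-up example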

As for the honesty and non-colluding properties of validators, that's what Ripple's oversight of the validator list is about. It's partly about the kinds of institutions we hope to bring on board—universities and charities and other organizations working for the public benefit, in addition to businesses who might have a stake in the continued proper behavior of the XRP Ledger, like exchanges, banks, money transmitters, and even web hosting companies.

 

To recap the Decentralization Strategy, here's a summary:

  1. Switch to using a validator list site (vl.ripple.com). This is where we are now.
    • All rippled instances configured to use the site can automatically follow Ripple's updates to the recommended set of validators, in lockstep.
    • In case you're curious, the validator list site publishes cryptographically signed recommendations of validators, so it's not easy to impersonate. And rippled caches the data it gets from the site, so the XRP Ledger won't go down even if vl.ripple.com is down for a while. (It might be tough to bring new rippled servers online while vl.ripple.com is down, but I think there are some protections against that, too.) See the config sketch after this list for roughly what this looks like on a server.
  2. Update the site and the existing validators to use validator tokens instead of master validator secret keys.
    • This adds security to the existing validators. By using tokens, Ripple can keep the master validator keys offline and periodically rotate the tokens, for example if an operations engineer who might have had access to the config files leaves the company.
  3. Update the site to add 16 new Ripple-controlled validators to the existing 5.
    • The main reason for this is so that any new individual validator isn't too large a slice of the pie.
  4. Add new third-party validators. For every two third-party validators, Ripple will remove a Ripple-controlled validator from the recommended list.
    • This will probably occur gradually over the course of 2018 and beyond.
  5. Eventually, once the network has grown, Ripple will encourage others to run validator list sites similar to vl.ripple.com. As long as the lists published on the different sites have sufficient overlap, servers using any list won't fork away from each other.
  6. The "Secret Future Stuff" I alluded to, which may also occur before or as part of step 5.
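
For anyone wondering what steps 1 and 2 look like from a server operator's point of view, here's a rough, unofficial sketch. The publisher key and token values are placeholders, not real ones; the stanza names are the [validator_list_sites], [validator_list_keys], and [validator_token] sections used by rippled's validators.txt and rippled.cfg:

# validators.txt (step 1, sketch): trust whatever the list site recommends,
# instead of hard-coding a [validators] list. The publisher key below is a
# placeholder; use the key actually published for vl.ripple.com.
[validator_list_sites]
https://vl.ripple.com

[validator_list_keys]
ED0000000000000000000000000000000000000000000000000000000000000000

# rippled.cfg (step 2, sketch): a validator configured this way holds a
# rotatable token rather than its master validator secret key. The token is
# generated offline (with the validator-keys program mentioned later in this
# thread) and can be re-issued if it is ever compromised.
[validator_token]
<placeholder: base64-encoded token generated from the offline master key>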


It seems there is no magic in the world, just hard work. Great stuff guys (even though this is not as reassuring as one might have thought after Stefan's decentralization post in May 2017).


Yeah, it's not documented in detail yet, in part because I haven't had time to dig into it and get all the details. But I can tell you a little bit.

The "blob" is a base64-encoded JSON string containing (for example) the following (you can verify this yourself by echoing the blob content and piping it to base64 -d if you're on *nix):

{
    "sequence": 2,
    "expiration": 569980800,
    "validators": [
        {
            "validation_public_key": "ED6C9E8456FDA70144A73E709D5096463F1585F7158881F3BDC53E8B4FF1A1AB9B",
            "manifest": "JAAAAAFxIe1snoRW/acBRKc+cJ1QlkY/FYX3FYiB873FPotP8aGrm3MhA9mu8kAXgZh9Es8Jmi43UOspDNKvuaBRmgVnoAcF7dPTdkYwRAIgGjUNlb0/CMVKRZ/xsqyRU44Xp6eW49YE7STmISMMo38CIFzRps3YAVOEJjORUHm+5jPVkPUM5haZSlh+H5A6H1CqcBJA+rO3/EBajQxi+X03bv88XLaJZ29JGHmp9KnDsw2yLjM1raGOVSCmW6aCshze4E0vjt0n3i9D3+0+6jEvJNdPAg=="
        },
        {
            "validation_public_key": "ED44FFE2F6249C37321A349C0A983C5D5D3EE334013AE4A5D8986E1674920354C9",
            "manifest": "JAAAAAFxIe1E/+L2JJw3Mho0nAqYPF1dPuM0ATrkpdiYbhZ0kgNUyXMhA2BsdqQcpySDEjiNfpWLH/SEReD0mKca2pBapPn8v21edkYwRAIgGfEX3KqbavKh+ULq7/uzY/pyXJsWKA9kMrWyd/S34wkCIDFQ7MHttljWbOMs18yfBIVNtyc714HHnj6L1aXOinCqcBJAkFsUkgaOACMfOgnLWl/kpTvAnY0mVH2r3cfSwZjsZyfrASiy1JCesixHOyAmypD9lCxEFESfWbNYXVSN/o3vDA=="
        },
        {
            "validation_public_key": "ED57EA43D51EA4D1C78929BB24090ED3C89F03E1ED72BC564E957D87273006544F",
            "manifest": "JAAAAAFxIe1X6kPVHqTRx4kpuyQJDtPInwPh7XK8Vk6VfYcnMAZUT3MhAygLFlHdFPSlbYNKy+ZjdkUDLYcdC9/z7AuDNaAh7sbCdkcwRQIhAMZ+PNIkFZXVQSzrv4rfQB4uSplW3/sim+P56c6+Vt8KAiAqX4+X+Meakzlgp00wN2mZ9cAS5jB4AFOfnIGvwWsl3HASQBhpRx9RGBhgRKNYybcXN95LF80g44DM6nHSAmJGfjVJBWaZoYfiJmK6akQaBnN4Q1NIMMJevPjXx7tPmv8GRwE="
        },
        {
            "validation_public_key": "EDAFA8C68121A5A7DCF8BE55BD68D56DD342B834289961CA39BE6F2D06B9F1E605",
            "manifest": "JAAAAAFxIe2vqMaBIaWn3Pi+Vb1o1W3TQrg0KJlhyjm+by0GufHmBXMhAqyqCmq4xrrWSV31jBpa25vDBUMEdD3upfaLa1VgzNFedkcwRQIhALZagRDPKZD7tYsglEc/oJkMsyjOcTPKuPTqWtStiQ6qAiBxJ4zpSq8LYxM7v+aVssw9LfGUDCACM+MNrAVT5YtG3XASQInSqzLnlR8E0anedpwFBiroXApOZZgtv94nh9WHxUlWhj5PZnJUW86RavArYCphz/TmK53EsNjEOz+KfuHGFgg="
        },
        {
            "validation_public_key": "ED998FE668B05429A4790412BED375B219F3BBF5ADCF21765879511A1657223454",
            "manifest": "JAAAAAFxIe2Zj+ZosFQppHkEEr7TdbIZ87v1rc8hdlh5URoWVyI0VHMhAxMm4MZwm5jTP+w9WmJqZDDHYoE2Qiewf3Mq/nBmg2CsdkYwRAIgaaQHafZHdae/s/Kmc8w7c2fJWA+uc9EpPG9dWABhUcwCIFeX9ASCzOOv5bQV42fcldG9X+22OIo02XXE1QD54PnWcBJAiylOFqrifImqqPknGYj7EQirpOPXU8NigspfpCFPZYoEB5/xmg3E9uiYlZe25Q3RE1hiVLGdCIGmVZz9TcdAAw=="
        }
    ]
}
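
In case it's useful, here's a minimal Python sketch of that same decode step. It's purely illustrative: it assumes only that vl.ripple.com serves a JSON document with a base64-encoded "blob" member, as described above, and uses the field names visible in the decoded example.

import base64
import json
from urllib.request import urlopen

# Fetch the published list and decode the base64 "blob" into the JSON shown above
with urlopen("https://vl.ripple.com") as resp:
    published = json.loads(resp.read().decode("utf-8"))

decoded = json.loads(base64.b64decode(published["blob"]).decode("utf-8"))
print("sequence:  ", decoded["sequence"])
print("expiration:", decoded["expiration"])  # presumably seconds since the Ripple epoch
for v in decoded["validators"]:
    print(v["validation_public_key"])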

I think a "manifest" is basically a synonym for a validator token (or the public part of it, anyway?): a somewhat ephemeral key signed by the master key pair, plus a sequence number (so a newer token can invalidate previous ones). Something like that. @wilsonianb knows this stuff better than I do.

13 hours ago, mDuo13 said:

We have some other plans to improve on that, which unfortunately I don't think I can talk about.

Thanks. Guaranteeing UNL overlap is _the_ major challenge in your consensus algorithm, and I have yet to see any public discussion about solutions.

13 hours ago, mDuo13 said:

It's partly about the kinds of institutions we hope to bring on board

I personally am very critical of the idea of targeting institutions instead of individuals as validators, especially since I see potential conflicts of interest, and because in the end institutions still consist of individuals (and since not everyone within an institution usually has the same access, probably only a tiny subset of a large institution is even aware that the organization is running a validator).

13 hours ago, mDuo13 said:

validator tokens instead of master validator secret keys

By the way, these are not documented to my knowledge, and they require a special program that you only ship in the RPM file...

Quote

As long as the lists published on the different sites have sufficient overlap, servers using any list won't fork away from each other.

As long as the lists overlap enough AND contain enough honest, non-Byzantine validators (which is nearly impossible to verify). Also, the initial set of validators that Ripple uses and will now expand has to be taken as a given, "canonical" starting set.

I am also not convinced that it would be in the best interest of list publishers to have high overlap - how the hell am I as a list publisher supposed to specialize in recommending unique nodes while also having to respect a global(!) UNL overlap covering areas and jurisdictions in which I have zero stake or expertise? I would feel confident generating a localized UNL, e.g. because I know my country or city well, but I have no way of making sure that a Chinese and a Mexican validator are not colluding. Yet I risk exposing my subscribers to a local fork if I don't include 80% of the other lists too.

