About Sukrim

  • Rank
    Advanced Member
  1. Higher Transaction Costs

    A validator should not connect directly to any server that is not under your direct control, ideally your physical control; this includes other validators. If your validator is directly reachable on the internet, I wouldn't want it in my UNL.
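    The advice above can be expressed as configuration. A minimal sketch of a rippled.cfg fragment, assuming the stock `[peer_private]` and `[ips_fixed]` sections; the IP addresses are placeholders for machines you actually control:

    ```
    # Ask peers not to forward this server's address to the wider network
    [peer_private]
    1

    # Only peer with these fixed, self-controlled hosts (host port per line)
    [ips_fixed]
    10.0.0.2 51235
    10.0.0.3 51235
    ```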

    I rather doubt that these are the "leading company" in the field at all, and in the end it is just about sending network packets. That is handled a few layers below the application; if their technology requires user-facing software to be changed, they are on the wrong path.
  3. It is very hard for third parties to assess whether a validation that did not reach a validator in time was a networking issue or was omitted on purpose. You'd need to do this for every round of Consensus, so more often than each ledger close (since by definition all validators eventually end up at the same ledger or stop validating). In my opinion a more realistic scenario is that some validators are legally not allowed to permit certain transactions (e.g. US-based validators enabling settlements between Iranian banks) and must always vote against them, while still being allowed to be overruled by the network as a whole. That is far down the line though, since there are probably fewer than 20 people worldwide who could even start adding something like this to rippled, much less actually succeed. On the other hand, such validators would likely be far more upfront about their validation policies ("I immediately validate every valid transaction" seems to be the current policy, but unfortunately it is NOT the only one possible, and for financial institutions it is currently not the one in use either). This makes things much harder for attestors, because you'd need to know every policy and be able to assess, for every transaction, whether it would be legal under each of them.
  4. Certificate and key revocation/rotation is very often a problem of discoverability: if a key is still valid according to itself but invalid according to some message signed by it later, everything depends on you actually having received that later message. RCL could help as a "single source of truth" here, but I'm not so sure it is the right approach. For this concrete problem I would go the Let's Encrypt way: make keys as short-lived as possible (1-3 months) and make sure that creating, issuing and distributing new ones is as painless as possible. Maybe also look into security hardware like YubiKey 4 devices, which are among the few out there that support 4096-bit PGP keys. It is very hard to prove with 100% certainty that you followed the UNL you claimed to have set. Validators could, for example, publish a signed copy of the UNL they claim to follow, but you'd then need to watch the network for disagreement to determine whether the validator really voted as that UNL would dictate (e.g. more than half of the validators on the UNL voted for A, yet the validator STILL voted B in the next round of Consensus instead of also voting A as the algorithm demands). Even that isn't good proof: maybe the validator did not receive the validations of those voting A in time, or at all, and thus genuinely believed that B had a valid majority. It might be possible to tighten this by publishing all validations taken into consideration at each round ("I vote A because I saw v1, v2 and v3 vote A but only v4 and v5 vote B, and haven't seen v6 and v7"), but that might get into performance-critical territory, depending on how fast the crypto is and how much extra data is produced and transmitted.
[Edit: You could also simply lie about not having received validations, voting as you wanted while looking "legit" to others - and of course the "play by the rules 100% of the time until you don't" scenario is not covered either.] There are other factors, like the example "it would be great to have a validator in both North and South Korea, because they don't trust each other and thus keep each other honest" - which in reality makes it VERY hard for financial institutions to select a good UNL, precisely because the best nodes to have on there are from countries and regions that your own government despises, probably even wages war against, or at least forbids you from supporting or interacting with in any way. It means you would give an enemy of the state recognition and power over whether some transfer of funds gets authorized. Publicly visible, and maybe even proven, UNLs and validations would make it very hard to select a really good UNL in the sense of a very diverse one, since you also grant importance and power to everyone on it. Lastly, you will want some trust agility (because you want the validation network to grow), but you don't want it to be extremely agile either (what if everyone uses the same attestation service or implementation and suddenly UNLs shift with every single ledger close?). It is extremely hard to come up with sensible parameters for such systems that don't lead to very unforeseen consequences (say, only one additional validator per year is allowed on your UNL - or one per day, or one per second) or that further down the line cement behaviour you don't want to see (e.g. people not rotating their PGP keys at all because the old one was so well trusted).
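The "did the validator follow its published UNL" check described above can be sketched in a few lines. This is illustrative only: the names, the ">50% of the whole UNL" threshold, and the idea of calling the vote from a single round are assumptions, not the actual rippled consensus thresholds - and as noted, a `None` result can never distinguish dishonesty from missed messages.

```python
# Hedged sketch: given the votes cast by members of a validator's
# published UNL in one round, work out which position a faithful
# UNL-follower should have taken.

def expected_vote(unl, votes):
    """Return the position a UNL-follower should take, or None if no
    strict majority of the whole UNL was observed.

    unl   -- set of validator keys this validator claims to follow
    votes -- dict mapping validator key -> position ("A", "B", ...)
    """
    tally = {}
    for v in unl:
        if v in votes:  # validators we never heard from simply don't count
            tally[votes[v]] = tally.get(votes[v], 0) + 1
    for position, count in tally.items():
        if count * 2 > len(unl):  # strict majority of the *entire* UNL
            return position
    return None  # no provable majority -> can't call a deviating vote dishonest

unl = {"v1", "v2", "v3", "v4", "v5"}
votes = {"v1": "A", "v2": "A", "v3": "A", "v4": "B"}
print(expected_vote(unl, votes))  # A: 3 of the 5 UNL members voted A
```

The `None` branch is the whole problem in miniature: an attestor who sees the validator vote B can only prove misbehaviour if a majority of the UNL's votes demonstrably reached it, which is exactly what can't be proven from the outside.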
  5. Higher Transaction Costs

    The size of the AccountState might also have something to do with the capacity: if you transfer XRP between the only 2 accounts on the ledger using the simplest possible transaction type, your maximum TX/s will look very different than with half a gigabyte or more of state that needs to be queried first for nearly everything. In the past, the AccountState set was much smaller.
  6. I can watch most movie titles online, and the main reason this is hard in general is a fragmented legal framework. I disagree with nearly all of your points, but I really don't like this discussion, so I guess we'll just have to wait and see.
  7. Higher Transaction Costs

    We don't know the capacity of the network, but it very apparently has been reached recently - at least the 5 core validators have reached theirs.
  8. Higher Transaction Costs

    If you are DoSing the ledger, then fees are designed to be unlimited and to rise exponentially. This is intended and documented behavior. Also, fees are always exactly 0 USD: they are calculated exclusively in XRP, which has a guaranteed value of 0 USD apiece. If XRP's current market value is above that, all the better, but that doesn't mean fees are intended to be any particular value measured in USD. They exist to make it hard to spam the ledger, and the script published by the ripple-lib devs does just that (spamming the ledger), with 0 safeguards for the expected fee spike. I would guess such a transaction wouldn't even make it through evaluation on the server, to keep load down - the server should immediately reject it without looking at the contents and not broadcast it or do anything else. The fee also pays for evaluation on the server side, which is why some failed transactions still make it onto the ledger. If the server thinks it is not even worth seeing what the outcome of a TX would be because the fee is too low, it shouldn't end up on the ledger and thus should cost no XRP. No idea how it works in reality - there could also be an issue where transactions with insufficient fees still make it onto the ledger, or reach other servers that accept them.
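    The shape of the escalation can be sketched as follows. The constants (`expected_size`, `base_fee`) and the exact growth curve are illustrative assumptions, not rippled's real formula; the point is only that once the open ledger is past its expected size, the required fee grows superlinearly with fullness, so a burst of cheap spam quickly prices itself out.

    ```python
    def required_drops(tx_count, expected_size=50, base_fee=10):
        """Illustrative minimum fee (in drops) to enter the open ledger.

        Below the expected ledger size the base fee applies; above it,
        the required fee scales with the square of how full the open
        ledger is - so a script dumping transactions with no fee cap
        walks straight up this curve.
        """
        if tx_count < expected_size:
            return base_fee
        return base_fee * tx_count * tx_count // (expected_size * expected_size)

    for n in (10, 50, 100, 500):
        print(n, required_drops(n))  # 10 -> 10, 50 -> 10, 100 -> 40, 500 -> 1000
    ```

    A client-side fee cap is simply a check of the submitted fee against a ceiling before signing - exactly the safeguard the script in question lacked.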
  9. Higher Transaction Costs

    But the fee levels ARE working as intended; their validators are just under higher load than expected. It doesn't help that many of their JavaScript devs don't seem to have a deep understanding of RCL in the first place: publishing a script with 0 fee caps that can dump LOTS of OfferCancel transactions at once is extremely close to indistinguishable from a DoS attack and will in any case incur huge fees. The decision to make fees increase exponentially might have sounded nice a while back, but without auditing downstream software and adding warnings, it can lead to disasters like this one.
  10. Higher Transaction Costs

    Not hard at all - you only need to update 3 or 4 nodes, and they are all owned by Ripple Inc. No, this data is being crawled manually by a separate project; validators do not store any historic data about validations. If you have any indication otherwise, please tell me in which database (table) this data is stored on my rippled server...
  11. Higher Transaction Costs

    Well, for over a month now their "validator registry", which tracks conformance of validators, has not been showing new data, without even a hint of acknowledgment. That's quite a long downtime for maintenance, especially considering they plan to use these tools to decide who gets on their UNL and thereby gains influence over the contents of the ledger... About the issue at hand: my suspicion is that the official validators are running up against hardware limits on their hosting platform (e.g. they can't access their storage backend fast enough to keep up with current ledgers), but this is hard to see from the outside. One way to verify would be increasing disagreement rates among the official validators - which seems to be the case, but the data is cut off in August: https://xrpcharts.ripple.com/#/validators
  12. Are we really decentralised?

    The opposite example also exists: Google has some nice graphs about Chrome versions thanks to automatic updates. A browser is different from a cryptocurrency validator though, and I agree that there should be far more deliberation involved when deciding which features are enabled, which UNLs are chosen, and how configurations are tested and deployed.
  13. Are we really decentralised?

    How could this be made secure in any way? A recent example of why this is a bad idea: http://blog.talosintelligence.com/2017/09/avast-distributes-malware.html Notifications that updates are available would be nice, I guess.
  14. Why are account ids digests?

    It is not very hard to find out how many accounts never transferred any funds. Hashing is extremely cheap anyways, another advantage is that it makes the account identifier a little bit shorter and predictable across different crypto schemes (RCL supports 2 different schemes nowadays). Also not only hashing but also base58 encoding is performed, which includes a few extra characters to make sure that typos are being caught - also an issue if you were always inputting raw public key.