What is forking and how does it impact cryptocurrencies
What Does Forking Mean for the Future of Bitcoin
What Is A Bitcoin Hard Fork - Explained | Bitcoin Insider
How Would a Hard Fork Affect Bitcoin’s - Coin Journal
Bitcoin XT vs Core, Blocksize limit, the schism that
What mechanism does Blockstream & co use to disrupt competing developer groups?
As most here know, Blockstream & co have a stranglehold on the Core development team which manages the node software known as 'Core' currently in use by a majority of the Bitcoin network. I can easily imagine the combination of threats or payoffs that it would take to accomplish that feat, but what I can't so easily understand is through what mechanism they prevent the rise of competing development groups. It seems pretty clear to me that the solution for resolving the hostile developer takeover of Core is to simply decide as a network to stop using Core software. Forking away from rogue developers was always the mechanism for maintaining decentralized control of the chain, but some people took that to mean forking away from the entire chain, which is asinine. Currently a very small group of corruptible humans have complete control over a single github repo and node implementation called Core. The Bitcoin network itself is made up of millions of individuals from all around the world who currently choose to mine with software released by the Core group, but we could just as easily choose to run software released by another group. We've seen some pretty solid attempts at node diversification in the past - first with XT, then Classic, then Unlimited. Each of those node implementations made a significant dent in the node ratio on the Bitcoin network, but each eventually fell out of favor. Then, with the addition of replay protection on the ABC node software, BCH forked away from the main network and with it all of the competing node software. I can't help but think that this was an intentional ploy by the same powers behind Blockstream to divert any competing developer groups away from the main chain, so that Blockstream and Core could maintain control over the Bitcoin network unopposed. So that leaves us with the question: what prevents the rise of successful developer groups on the Bitcoin network?
I know that competing node implementations have been DDoS attacked in the past, but is that it? Is that the whole strategy? Could that single mechanism be responsible for the uncontested control Core currently enjoys over the Bitcoin node ratio, or is there something else? Why did XT, Classic, and Unlimited all fail to gain majority use on the network, and how can we ensure that the next attempt is successful where past attempts have failed? I know that many will respond here with the opinion that BCH has solved this problem, but really that just sidesteps the issue. If Core could be so easily compromised, then so could ABC or Unlimited. Even if the whole world were to move to BCH without a solution to this underlying problem, the entire takeover would simply be repeated. We need to set the precedent of forking away from a rogue node implementation, and we need to do it on the main chain. This is the only real solution.
I have enjoyed watching Cobra slowly change his public position on several things. It's never too late to make up for one's mistakes in my eyes! Still, maybe this is a fun opportunity to revisit some of the highlights of Cobra's tenure over Bitcoin.org.
15 June 2015: Bitcoin.org introduces an official 'position on hard forks,' which meant it would not promote XT: "Bitcoin.org will not promote software or services that will leave the previous consensus because of a contentious hard fork attempt."
26 December 2015: Cobra makes a Github issue to remove Coinbase from the "choose your wallet" page on Bitcoin.org to punish them for running a Bitcoin XT node in addition to their Bitcoin Core node.
01 July 2016: Cobra posts on Github that he wants to update the Bitcoin whitepaper because it confuses people.
03 July 2017: An issue is opened on the Bitcoin.org Github account titled "Removal of BTC.com wallet?" Cobra replies that he doesn't mind if they are removed from the website as punishment for their association with Jihan Wu and, therefore, big-block efforts: "They're associated with that monster Jihan Wu, so I don't mind if they get removed because of this, they're terrible people." I definitely feel like a line has been crossed here.
28 September 2017: Cobra opens a Github issue for Bitcoin.org labeled "Add Segwit2x Safety Alert." The alert ostensibly warns users of hostile companies, but the list includes most of the oldest, most successful Bitcoin companies, and the real goal seems to be to scare or punish these companies for their stance on Segwit2x.
08 November 2017: Theymos creates a Github issue titled "Policy to fight against 'miners control Bitcoin' narrative." Hilariously, Cobra replies in agreement and blames this misconception on the white paper itself.
11 August 2018: Cobra creates a Github issue to discuss relisting companies that were removed from Bitcoin.org for supporting the 2MB hard-fork block size increase.
His reasoning is that the delisting worked to stop the increase and that these companies are unlikely to try it again. You will be missed Cobra! If you are interested in appointing a member of Bitcoin.org who will be more open to competing ideas, I invite you to DM me. I am very familiar with the Jekyll static site generator you're currently using.
A Guide To The BCH Fork on November 15th - Be Informed!
BCH November 15th Forking Guide
Intro As you may have heard, on 15th November 2018 the Bitcoin Cash blockchain will fork into at least two separate chains. We felt it our duty to provide information to the community that we hope will offer some clarity on this rather complex situation.
What Is A Fork? A fork occurs when at least one group of miners decides to follow a set of rules different from the current consensus protocol. Due to the way Bitcoin is designed, these miners will then operate on a separate network from the current one. This is in fact how Bitcoin Core and Bitcoin Cash were created from the original Bitcoin: both changed the consensus rules in different ways that made them incompatible. To make the current situation slightly more complex, there are two sets of miners changing the protocol rules away from the current protocol. It is not expected that the currently operating consensus rules will remain in use by any significant set of miners after November 15th. This means that after November 15th there will be two new sets of competing protocol rules. For simplicity these will be described as the BitcoinABC ruleset and the BitcoinSV ruleset (although other implementations such as Bitcoin Unlimited, bcash, bchd, BitcoinXT and bitprim also follow the ABC consensus ruleset). This is quite a unique fork situation, as one side (BitcoinSV) has indicated that it is willing to attack its competition (BitcoinABC) using reorgs and doublespends to destabilise it and reduce confidence in it.
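To make the mechanics concrete, here is a minimal sketch (purely illustrative, not a real client) of how nodes with different consensus rules split into separate networks: each group only accepts blocks valid under its own rules, so each extends its own chain. The block sizes and the block itself are invented for the example.

```python
# Toy model: a "validator" is just the consensus rule a node enforces.
# Nodes enforcing different rules accept different blocks -- that is a fork.

def make_validator(max_block_size):
    def is_valid(block):
        return block["size"] <= max_block_size
    return is_valid

current_rules = make_validator(32_000_000)   # 32 MB, the pre-fork limit
sv_rules = make_validator(128_000_000)       # 128 MB, the SV default

big_block = {"height": 1, "size": 64_000_000}  # hypothetical 64 MB block

print(current_rules(big_block))  # False: rejected under the old rules
print(sv_rules(big_block))       # True: accepted under the SV ruleset
```

Once one group mines and builds on a block the other group rejects, the two groups can no longer agree on a single chain tip.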
BitcoinABC Fork Details The main new features in BitcoinABC that make it incompatible with the current protocol are CTOR and DSV. To summarise: CTOR (Canonical Transaction Ordering) allows blocks to be transmitted in a much more efficient way. This means that as blocks become larger with growing adoption, the hardware and bandwidth requirements on nodes are decreased. This reduces centralisation pressures and allows the network to scale with fewer adverse effects. You can read more about CTOR in this excellent ARTICLE by u/markblundeberg. DSV (CheckDataSigVerify) enables oracles directly on the Bitcoin blockchain. This means that transactions on the Bitcoin blockchain can depend on events that happen in the real world. For example, you could bet on tomorrow's weather, or on whether a specific candidate wins an election, all directly on the blockchain. You can read more about DSV in this excellent ARTICLE by u/mengerian.
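The "canonical" part of CTOR can be sketched in a few lines: every transaction after the coinbase is sorted lexicographically by its txid, so a block's ordering is derivable from its contents alone rather than being chosen by the miner. The raw transaction bytes below are made up for illustration.

```python
import hashlib

def txid(raw_tx: bytes) -> str:
    # Bitcoin txids are the double-SHA256 of the serialized transaction
    return hashlib.sha256(hashlib.sha256(raw_tx).digest()).hexdigest()

def canonical_order(txs):
    # Under CTOR the coinbase stays first; everything else is sorted by txid
    coinbase, rest = txs[0], txs[1:]
    return [coinbase] + sorted(rest, key=txid)

block_txs = [b"coinbase", b"tx-b", b"tx-a", b"tx-c"]
ordered = canonical_order(block_txs)
print(ordered[0])  # b'coinbase' stays first regardless of sorting
```

Because both sender and receiver can compute this order independently, a block can be relayed without transmitting ordering information, which is where the bandwidth savings come from.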
BitcoinSV Fork Details The main new features in BitcoinSV that make it incompatible with the current protocol are an increase in the default block size limit to 128MB, an increase of the 201-opcode limit within Bitcoin's script system to a maximum of 500 opcodes, and a new set of opcodes including: OP_MUL, OP_LSHIFT, OP_RSHIFT and OP_INVERT. The increase in the default block size limit will in theory allow miners on the BitcoinSV ruleset to produce and propagate blocks up to 128MB in size. The current state of the network may not be able to handle, or at least sustain, 128MB blocks, but this will allow miners to decide whether they want to try to produce blocks over 32MB (the current protocol limit agreed upon by miners). Increasing the opcode limit will allow miners to accept transactions with longer scripts, meaning more complex scripts can be developed. The new opcodes allow new operations within the Bitcoin scripting system.
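What an opcode limit and arithmetic opcodes mean mechanically can be shown with a toy stack interpreter (this is a deliberately simplified sketch, not the real script engine, and the numeric operands are invented):

```python
MAX_OPS = 500  # SV raises the per-script opcode limit from 201 to 500

def run(script):
    # Minimal stack machine: ints are data pushes, strings are opcodes.
    # As in real Bitcoin script, data pushes do not count toward the limit.
    stack, op_count = [], 0
    for op in script:
        if isinstance(op, int):
            stack.append(op)
            continue
        op_count += 1
        if op_count > MAX_OPS:
            raise ValueError("opcode limit exceeded")
        if op == "OP_MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "OP_LSHIFT":
            b, a = stack.pop(), stack.pop()
            stack.append(a << b)
        else:
            raise ValueError(f"unknown opcode {op}")
    return stack

print(run([6, 7, "OP_MUL"]))     # [42]
print(run([1, 5, "OP_LSHIFT"]))  # [32]
```

A higher opcode ceiling simply lets scripts like this contain more operations before a node rejects them as too complex.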
What Are Your Options? When the fork happens your coins will become available on both chains. This is because both chains share the same blockchain history up until the point the fork occurs. Things are unfortunately not quite as simple as that (when are they ever in cryptoland?). Transactions that would be valid on both chains will occur on both chains. Your transactions will be considered valid on both chains as long as you do not use any of the features exclusive to either ruleset, or spend inputs from transactions that are considered invalid on one of the chains. You can alternatively split your coins so that you control them exclusively on each chain. So what should you do? We won't recommend what you should do with your own money, and this is not financial advice, but here are some of your options.
Do Nothing and HODL The simplest option is to keep your Bitcoin Cash in a wallet you control and wait for things to blow over. Make sure you have the private keys and/or the seed written down in at least one place so you can recover your funds if needed. As long as you do not move your funds, they will be available on both chains after the fork. Risks - Price volatility. As always, the price can go up or down by any amount. Only risk what you can afford to lose.
Sell BCH for Fiat Another simple option is to sell your BCH for fiat. This means moving your Bitcoin Cash to an exchange such as Bitstamp.net, Kraken.com or Coinbase, and then selling it for a fiat currency. You may also consider then withdrawing your funds to your bank account for extra security (exchanges have been known to implode with everyone's funds every now and again). Risks - If the BCH price increases while you hold fiat, you will end up with less BCH if and when you buy back. Exchanges and banks can confiscate your money if they like (that's why we love Bitcoin, remember?). By selling you may also be liable for taxes in your jurisdiction.
Split Your Coins and HODL If you want to be ready for anything, you can split your coins after the fork occurs. This means you will be able to control your coins exclusively on each chain. You will still need to make sure you have your wallet(s) backed up and have the private keys and seeds written down somewhere. To split your coins you can use a tool developed on Electron Cash HERE. This is unfortunately not a simple tool to use right now. Make sure to read the tips and advice given in that thread. You can also use http://forkfaucet.cash/ to receive a tiny amount of split coins to your address(es) so that your coins become split once you spend from them. Risks - This has the same risks as simply HODLing your BCH. You should also be aware that some services have decided to refuse split coins during the fork. This means that if you send them split coins they will not allow you to spend them. These services include: Yours.org, moneybutton, HandCash, CentBee and CoinText.
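The reason the faucet-dust trick works can be sketched in a few lines: a transaction is only valid on chains where all of its inputs are valid, so combining a pre-fork coin with an input that exists on one chain only makes the result (and all its descendants) exclusive to that chain. The chain labels and UTXOs here are illustrative.

```python
# Hedged sketch: model each input as the set of chains it is valid on.

def tx_chains(input_chain_sets):
    # A transaction is valid only on chains where ALL its inputs are valid.
    chains = {"ABC", "SV"}
    for s in input_chain_sets:
        chains &= s
    return chains

unsplit_utxo = {"ABC", "SV"}  # pre-fork coin, replayable on both chains
faucet_dust = {"ABC"}         # split dust, valid on one chain only

print(tx_chains([unsplit_utxo]))               # still valid on both chains
print(tx_chains([unsplit_utxo, faucet_dust]))  # now valid on one chain only
```

This is also why a single tiny split input is enough: it "taints" the whole transaction onto one chain, leaving the coins on the other chain untouched and spendable separately.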
Split Your Coins and Sell Some If you are interested in gambling on which chain will be more successful, you can split your coins using the method above and then send coins from either chain to an exchange that allows buying and selling of a specific side of the fork. Most exchanges have decided to close deposits, withdrawals and even trading of BCH until the outcome of the fork has become more clear. After the fork occurs, exchanges will likely announce which chain(s) they will support (potentially both), and you will then be able to trade each fork as a separate cryptocurrency. Risks - By selling your coins on one of the chains you will no longer be invested in that side of the fork. In the case that one side of the fork ceases to exist and you are only holding coins on that side, you will have lost that money. By selling you may also be liable for taxes in your jurisdiction.
Summary It is unfortunate that Bitcoin Cash has to go through a fork without unanimous consensus on the new protocol rules. The unique situation with this fork in particular has presented some interesting new issues, and it is likely that we as a community will learn a lot from it. We hope that in similar situations in the future the major entities in the industry, including miners, developers, businesses and community leaders, can come together to find a compromise that keeps the ecosystem stable and focused on adoption. Further Resources You can get more information at bitcoincash.org, bitcoinabc.org, bitcoinsv.io, and bitcoin.com. If you have further questions about this or just want to discuss the fork in general, we encourage you to join our chat at bitcoincashers.org/chat and join the conversation.
Bitcoin Verde: A New Consensus Full-Node Implementation for BCH
For the past year I have been working on a full node in Java, completely from scratch. Today, after so much research, work, communication and testing, I am very happy to release the first beta version of Bitcoin Verde--on the genesis block's 10th birthday, no less! Bitcoin Verde is a ground-up implementation of the Bitcoin (Cash) (BCH) protocol. This project is a full node, blockchain explorer, and library. In the past, the lack of diversified development teams and node implementations has caused bugs to become part of the protocol. BCH currently has roughly three common full-node implementations (Bitcoin ABC, Bitcoin XT, Bitcoin Unlimited). However, these implementations are forked versions of Bitcoin Core, which means they may share the same (undiscovered) bugs. With a diverse network of nodes, bugs in any one implementation of the protocol will result in incompatible blocks between nodes, causing a temporary fork. This situation is healthy for the network in the long term, as the temporary forks will resolve over time, with the intended ruleset becoming the consensus. Bitcoin Verde approaches many of the design decisions made by the reference client very differently--most prominently, Bitcoin Verde stores the entire blockchain in its database, not just the UTXOs. Because of this, reorgs are handled very differently and it's even possible to validate multiple forks at the same time. In fact, you can visit http://bitcoinverde.org/blockchain/ to view some of the forks our instance has encountered. The node considers the chain matching its consensus rules and having the most PoW to be its "head" chain. I've spent a lot of time talking with the Bitcoin XT group to attempt to stay in step with their consensus rules as much as possible, and it is my goal to ensure we are diversifying the implementation of the network, NOT separating it. Because of that, please be sure to treat this release as a beta.
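The head-selection rule described above (most cumulative PoW among forks that validate under the node's own consensus rules) can be sketched roughly like this; the fork records and work values are invented for illustration, and real nodes track cumulative work per block rather than per tip:

```python
# Rough sketch of head-chain selection when multiple forks are kept around:
# filter to forks that pass our consensus rules, then take the most work.

def select_head(forks):
    valid = [f for f in forks if f["valid"]]
    return max(valid, key=lambda f: f["total_work"])

forks = [
    {"tip": "fork-a", "valid": True,  "total_work": 9_100},
    {"tip": "fork-b", "valid": True,  "total_work": 9_350},
    {"tip": "fork-c", "valid": False, "total_work": 9_900},  # breaks our rules
]
print(select_head(forks)["tip"])  # fork-b
```

Note that the highest-work fork loses if it fails validation: rule compliance is checked first, and proof-of-work only breaks ties among rule-following chains.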
Currently Bitcoin Verde does not have a finished mining-pool module, but once confidence has been raised about the consistency of the rulesets, this is a feature we intend to implement, and Bitcoin Verde will become a mining full node. Every component is multithreaded, including networking, validating, mempool acceptance, etc. It is my hope that during the next network stress test, Bitcoin Verde can help to gather statistics on forks, transactions per second, and block/tx propagation time. Bitcoin Verde has its drawbacks: it's a resource hog. Since the whole blockchain is indexed, the disk footprint is about 600GB. Initial-block-download memory usage is configurable, but is about 4GB (1.5GB for the database + 0.5GB for the node + 1GB for the tx-bloom filter + 1GB for the UTXO cache). Another drawback is that Bitcoin Verde "does more stuff"--it is essentially a block explorer, and because of that, the initial block download takes about 2-4 days to index the whole chain and its addresses. Bitcoin Verde has been tested for weeks on Linux (Debian) and OS X. The node has not been tested well on Windows and it may in fact not even sync fully (a Windows-only issue, currently). If you're a Windows user and are tech-savvy, feel free to give it a go and report any issues. I wanted to give my thanks to the Bitcoin XT team for being so welcoming of me. You're a great group of guys, and thanks for the conversations. Explorer: http://bitcoinverde.org Source: https://github.com/softwareverde/bitcoin-verde Documentation: http://bitcoinverde.org/documentation/ Forks: http://bitcoinverde.org/blockchain/ Node Status: http://bitcoinverde.org/status/
I've been interested in this fork which is happening this Friday. This fork is unique: to my knowledge it is the only coin to hard fork where there are two different sets of changes, not one set of changes that the miners agree or disagree on, as was the case with the original Bitcoin Cash fork. What are your thoughts on Bitcoin ABC and Bitcoin SV? It seems the community is quite split on this. SV proponents claim that it sticks to Satoshi's ethos for Bitcoin: it is strictly for payments, brings back some old BTC opcodes, and increases the potential block size to 128MB. ABC, along with CTOR, will allow a new opcode, OP_CHECKDATASIG, which will permit the validation of messages from outside the blockchain. This will enable uses such as oracles and cross-chain atomic contracts. This new opcode appears to be the most contentious issue surrounding ABC (are there more?). There is quite a lot of drama around this; it will be an interesting event https://cointelegraph.com/news/bitcoin-cash-drama-battle-lines-drawn-ahead-of-scheduled-hard-fork edit: i copied the wrong post
A reminder of who Craig Wright is and the benefits to BCH now he has gone.
This needs to be repeated every so often on this subreddit so new people can understand the history of the fork of BCH into BCH and BSV. From Jonald Fyookball's article https://medium.com/@jonaldfyookball/bitcoin-cash-is-finally-free-of-faketoshi-great-days-lie-ahead-bb0c833e4c5d Craig S. Wright (CSW) leaving the Bitcoin Cash community is a wonderful thing. This self-described "tyrant" has been expunged, and now we can get back to our mission of bringing peer-to-peer electronic cash to the world. The markets will rebound when they see the chaos is over, but regardless of the price, we will keep building. Nothing will stop the sound money movement. Calling Out Bad Behavior As Rick Falkvinge recently explained, there is a difference between small-minded gossiping about personalities and legitimately calling out bad behavior. CSW's bad behavior must be called out, because he has done tremendous damage to Bitcoin Cash (and possibly even the entire cryptocurrency sector). The brief history is that he gained his reputation by claiming to be Bitcoin's creator (Satoshi Nakamoto). He said he would provide "extraordinary proof" but he has never done so. Supposedly, he did some "private signings" for a few people, and this allowed him to gain influence in the BCH community. The destruction he has been causing was not widely recognized until after a huge mess had been made. Thanks to u/Contrarian__ for the following compilation of CSW's misdeeds. Some background on Craig's claim of being Satoshi, for the uninitiated:
He faked blog posts
He faked PGP keys
He faked contracts and emails
He faked threats
He faked a public key signing
He has a well-documented history of fabricating things, bitcoin and non-bitcoin related
He faked a bitcoin trust to get free money from the Australian government but was caught and fined over a million dollars
And specifically concerning his claim to be Satoshi:
He has provided no independently verifiable evidence
He is not technically competent in the subject matter
His writing style is nothing like Satoshi's
He called bitcoin "Bit Coin" in 2011 when Satoshi never used a space
He actively bought and traded coins from Mt. Gox in 2013 and 2014
He was paid millions for 'coming out' as Satoshi as part of the deal to sell his patents to nTrust - for those who claim he was 'outed' or had no motive
Caught Red Handed Plagiarizing No respectable academic, scientist, or professional needs to stoop so low as to steal and take credit for the work of others — least of all Satoshi. Yet, CSW has already been caught at least 3 times plagiarizing.
His paper on selfish mining has full sections copied almost verbatim from a paper written by Liu & Wang.
His "Beyond Godel" paper, which purports to show that Bitcoin script is Turing complete, is heavily plagiarized.
A paper on block propagation was blatantly and intentionally plagiarized.
Can’t Even Steal Code Correctly CSW was also caught attempting to plagiarize a “hello world” program (the simplest of all computer programs). He apparently does not understand base58 or how Bitcoin address checksums work (both of these are common knowledge to experienced Bitcoiners), and has made other embarrassing errors. So How Did Such an Obvious Fraud Gain So Much Power and Influence? There are no easy answers here. It seems that as humans, we are very susceptible to manipulation and misinformation. The greatest weapon against sinister forces is a well-educated populace. This is something that can only improve over the long run. The “Satoshi factor” is a powerful one and appeals to the glamorization of a mythical figure. Even people such as myself, who are technically astute, gave CSW every benefit of the doubt until the evidence staring us in the face could no longer be denied. The seduction of the BCH community was also facilitated by CSW becoming a strong advocate for the on-chain/big-block scaling movement at a time when the community was dying to hear it. This message, delivered with a brazen, in-your-face style, was a sharp contrast to anything seen before. In addition, CSW was able to find obscure topics (“2pda”, network topology, etc.) that seemed to establish him as an expert with esoteric knowledge above and beyond anyone else. Basically, he was using technobabble, but it wasn’t immediately obvious except to very technical people… who were then attacked and discredited. Eventually, as more and more of the community began to realize his technical claims were bogus, CSW banned those people from his twitter feed and slack channel, leaving only a group of untechnical “believers”, which the larger BCH community referred to as “the church” AKA the Cult-of-Craig. Finally, if some believed that CSW possessed Satoshi’s stash of 1M BTC, then they may have been itching to get a piece of it. But it may turn out that these are the coins that never were.
Broken Promises If this article so far seems like an “attack piece” on CSW, remember it is important to get all the facts out in the open. We’ll get to the silver lining and bright future in a moment… but let’s continue here to “get it all out”. One of the biggest ways that CSW has damaged the community is to make an endless series of broken promises. This caused others to wait, to waste time on his unproven ideas and solutions, and to postpone or drop their own ideas and initiatives.
He said he was building a mining pool to “stop SegWit”
He said he was bringing big companies to use the BCH chain
He said he was providing a fungibility solution based on blind threshold signatures
He said he was providing novel technology based on oblivious transfers
He said he was providing a method where people could do atomic swaps without using timelocks
He said he was going to show everyone how we can do bilinear pairings using secp256k1
He said he was going to release source code for nakasendo
He said he was releasing some information that would “kill the lightning network”
He said he was going to show everyone how the selfish mining theory is wrong
He said he was going to show everyone how we can tokenize everything in the universe squared
He said a few times “big things are coming in 2 months”
How CSW Has Damaged the BCH Community In addition to the broken promises, the BCH community was wounded due to:
- The division of the community (with classic divide-and-conquer tactics)
- Loss of focus; huge amounts of drama and distraction from building and adoption
- Investor confidence shaken due to uncertainty and chaos
- BCH becoming a laughing stock to outsiders due to CSW’s antics
- The Gemini deployment of BCH and other rollouts paused
- Loss of developer talent due to a toxic and abrasive personality
- Various patent and legal threats
The Hash War Event and Split into BitcoinSV

Every 6 months, BCH has a scheduled network upgrade. This is technically a “hard fork”, but a non-contentious fork does not result in a split of the chain: it is simply new network rules being activated. Bitcoin Cash has multiple independent developer groups, including Bitcoin ABC, Bitcoin Unlimited, Bitcoin XT, Bitprim, BCHD, bcash, parity, Flowee, and others. The nChain group, led by CSW, introduced an alternate set of changes a week before the agreed cut-off date, intentionally causing a huge controversy. These changes were incompatible with the changes being discussed among the other groups. nChain objected to the changes being proposed (canonical transaction ordering) despite specifically agreeing to it almost a year earlier. The last-minute objections were, in my opinion, an attempt at sabotage. An emergency meeting was held in Bangkok to attempt to resolve the differences between the nChain group and the rest of the community. Not only did CSW refuse to listen to the other presentations, he walked out of the meeting after his own speech had been given. The other nChain people refused to discuss the technical issues. After this, nChain built their own software (“BitcoinSV”) to attempt to compete for the Bitcoin Cash network. But rather than split off to follow their own set of rules, they threatened to attack Bitcoin Cash. Their attitude was “you follow our rules or we burn it all down”. The CSW sycophants adopted a strange interpretation of the Bitcoin whitepaper and proselytized the idea that if nChain could “out-hash” everyone else, the market should be obliged to follow them. This faulty thinking was eloquently debunked by u/CatatonicAdenosine. As it turns out, nChain was unable in any case to win at their own game.

But Here’s the Obviously Good News…

CSW is gone. It’s over. He can do whatever he wants on the BitcoinSV chain. He will never be allowed to influence Bitcoin Cash again.
And all the negative things and negative people that were a consequence of his involvement in Bitcoin Cash are gone with him. As a community, we will redouble our efforts and get back to our mission of peer-to-peer electronic cash. We will learn to work together better than ever, and we will learn to detect and punish bad behavior sooner. The attempted attacks with hashpower also sparked innovation and a focus on the problem of how to stop such attacks in the future. This is only making Bitcoin Cash (BCH) and the entire class of Proof-of-Work coins stronger. Nothing will stop us. The reason why millions of dollars were spent to attack and also to defend Bitcoin Cash is because it’s something truly worth fighting over. It’s sound money. It’s permissionless. It’s what Satoshi Nakamoto wrote about in 2008. It’s Bitcoin, a Peer-to-Peer Electronic Cash System.
A personal opinion with a collection of links and quotes
I don't take much joy in writing this post; however, with the upcoming fork and all the drama surrounding it, I felt compelled to do so. One thing I have advocated over the years, along with many others in this space, is to judge ideas based on their merit, and not based on the person presenting the idea. However, it's crucial that along with this general rule of thumb, we as humans also align with our own philosophical ideas, morals, and ethics when we make decisions. Otherwise we end up with a conflict of our own self-interests, i.e., cognitive dissonance.

For example, let's say I'm completely against the state; for this example, let's say I'm an anarchist. Hypothetically speaking, someone presents an idea that is technically sound and is, overall, an amazing idea by itself. I may like it a lot! However, I find out later that the person presenting the idea is completely pro-state, and has stated that he will use this idea to promote statist ideas and agendas. Even though the idea itself is sound and good, I know that the person behind the idea holds principles so much in conflict with my own philosophies in life that I will begin to discount the idea -- not because the idea itself is bad -- but because I know the person behind it will use it in ways that don't align with my own personal views.

Another thing I've advocated over the years is to think critically, independently, and with an open mind. I believe I've stayed true to this, and that is exactly what I am doing here. Bitcoin is built by humans, and is not artificial intelligence (at least not yet). This means that although ideas alone can have merit, we must also consider all the factors that go into an idea and how that idea will be used. If this conflicts with our life views, then we need to weigh that as well when evaluating ideas.
Below is a collection of links and quotes from Craig Wright, in the order I found them. They show the following:
This person wants to be the King of Bitcoin, the sole ruler
This person wants full control of Bitcoin; if he could control 100% of the hash power, he would
This person has no care for decentralization
This person does not care about anonymity at all
This person does not want permissionless innovation
This person cares more about the state than individual freedoms
This person is a patent troll who will undoubtedly use his patents for evil
This person is a liar (see plagiarism and previous claims to being Satoshi)
This person is pro-censorship (believes in blacklisting transactions and censoring discussion forums)
This person does not believe in unity and is dividing and fracturing us with the goal of gaining control
This person does not care about you or me, and certainly not about the economic freedom of the world
I was planning to submit a pull request to the 0.11 release of Bitcoin Core that will allow miners to create blocks bigger than one megabyte, starting a little less than a year from now. But this process of peer review turned up a technical issue that needs to get addressed, and I don’t think it can be fixed in time for the first 0.11 release. I will be writing a series of blog posts, each addressing one argument against raising the maximum block size, or against scheduling a raise right now... please send me an email ([email protected]) if I am missing any arguments
In other words, Gavin proposed a hard fork via a series of blog posts, bypassing all developer communication channels altogether and asking for personal, private emails from anyone interested in discussing the proposal further. On May 5 (1 day after Gavin submitted his first blog post), Mike Hearn published The capacity cliff on his Medium page. 2 days later, he posted Crash landing. In these posts, he argued:
A common argument for letting Bitcoin blocks fill up is that the outcome won’t be so bad: just a market for fees... this is wrong. I don’t believe fees will become high and stable if Bitcoin runs out of capacity. Instead, I believe Bitcoin will crash. ...a permanent backlog would start to build up... as the backlog grows, nodes will start running out of memory and dying... as Core will accept any transaction that’s valid without any limit a node crash is eventually inevitable.
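Hearn's backlog claim is straightforward queueing arithmetic: whenever transactions arrive faster than block capacity can clear them, the excess accumulates without bound. A toy model makes this concrete; the rates below are hypothetical numbers chosen only for illustration, not measured figures:

```python
def mempool_backlog(arrival_tps: float, capacity_tps: float, hours: float) -> float:
    """Unconfirmed transactions accumulated when demand exceeds block capacity.

    If arrivals never exceed capacity, the backlog stays at zero; otherwise
    it grows linearly with time (the scenario Hearn warns about).
    """
    excess = max(0.0, arrival_tps - capacity_tps)
    return excess * hours * 3600

# Hypothetical: 4 tx/s arriving vs ~3 tx/s of sustained block capacity.
print(int(mempool_backlog(4.0, 3.0, 24)))  # 86400 unconfirmed tx after one day
```

Even a modest sustained overload of one transaction per second produces tens of thousands of queued transactions per day; a node holding all of them in memory without any eviction policy is the crash scenario Hearn describes.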
He also, in the latter article, explained that he disagreed with Satoshi's vision for how Bitcoin would mature:
Neither me nor Gavin believe a fee market will work as a substitute for the inflation subsidy.
Gavin continued to publish the series of blog posts he had announced while Hearn made these predictions.  Matt Corallo brought Gavin's proposal up on the bitcoin-dev mailing list after a few days. He wrote:
Recently there has been a flurry of posts by Gavin at http://gavinandresen.svbtle.com/ which advocate strongly for increasing the maximum block size. However, there hasn't been any discussion on this mailing list in several years as far as I can tell... So, at the risk of starting a flamewar, I'll provide a little bait to get some responses and hope the discussion opens up into an honest comparison of the tradeoffs here. Certainly a consensus in this kind of technical community should be a basic requirement for any serious commitment to a blocksize increase. Personally, I'm rather strongly against any commitment to a block size increase in the near future. Long-term incentive compatibility requires that there be some fee pressure, and that blocks be relatively consistently full or very nearly full. What we see today are transactions enjoying next-block confirmations with nearly zero pressure to include any fee at all (though many do because it makes wallet code simpler). This allows the well-funded Bitcoin ecosystem to continue building systems which rely on transactions moving quickly into blocks while pretending these systems scale. Thus, instead of working on technologies which bring Bitcoin's trustlessness to systems which scale beyond a blockchain's necessarily slow and (compared to updating numbers in a database) expensive settlement, the ecosystem as a whole continues to focus on building centralized platforms and advocate for changes to Bitcoin which allow them to maintain the status quo.
The point of the hard block size limit is exactly because giving miners free rule to do anything they like with their blocks would allow them to do any number of crazy attacks. The incentives for miners to pick block sizes are nowhere near compatible with what allows the network to continue to run in a decentralized manner.
I'm not so much opposed to a block size increase as I am opposed to a hard fork... I strongly fear that the hard fork itself will become an excuse to change other aspects of the system in ways that will have unintended and possibly disastrous consequences.
there has been significant public discussion... about why increasing the max block size is kicking the can down the road while possibly compromising blockchain security. There were many excellent objections that were raised that, sadly, I see are not referenced at all in the recent media blitz. Frankly I can't help but feel that if contributions, like those from #bitcoin-wizards, have been ignored in lieu of technical analysis, and the absence of discussion on this mailing list, that I feel perhaps there are other subtle and extremely important technical details that are completely absent from this--and other-- proposals. Secured decentralization is the most important and most interesting property of bitcoin. Everything else is rather trivial and could be achieved millions of times more efficiently with conventional technology. Our technical work should be informed by the technical nature of the system we have constructed. There's no doubt in my mind that bitcoin will always see the most extreme campaigns and the most extreme misunderstandings... for development purposes we must hold ourselves to extremely high standards before proposing changes, especially to the public, that have the potential to be unsafe and economically unsafe. There are many potential technical solutions for aggregating millions (trillions?) of transactions into tiny bundles. As a small proof-of-concept, imagine two parties sending transactions back and forth 100 million times. Instead of recording every transaction, you could record the start state and the end state, and end up with two transactions or less. That's a 100 million fold, without modifying max block size and without potentially compromising secured decentralization. The MIT group should listen up and get to work figuring out how to measure decentralization and its security.. Getting this measurement right would be really beneficial because we would have a more academic and technical understanding to work with.
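The "record the start state and the end state" idea in the quote above is essentially payment netting: however many transfers two parties exchange, only the opening and closing balances need to settle on-chain. This sketch is a conceptual illustration of that aggregation argument only, not how any real payment-channel protocol works:

```python
def net_settlement(balances: dict, payments: list) -> dict:
    """Apply a long list of (payer, payee, amount) transfers off-chain.

    Only the opening balances and this final state would need to hit the
    chain, regardless of how many intermediate payments occurred.
    """
    final = dict(balances)
    for payer, payee, amount in payments:
        final[payer] -= amount
        final[payee] += amount
    return final

start = {"alice": 100, "bob": 100}
# A million back-and-forth micropayments...
payments = [("alice", "bob", 1), ("bob", "alice", 1)] * 500_000
print(net_settlement(start, payments))  # {'alice': 100, 'bob': 100}
```

The compression ratio is exactly the point of the quote: a million transfers collapse into two on-chain records, with no change to the block size limit.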
When Bitcoin is changed fundamentally, via a hard fork, to have different properties, the change can create winners or losers... There are a non-trivial number of people who hold extremes on any of these general belief patterns; even among the core developers there is not a consensus on Bitcoin's optimal role in society and the commercial marketplace. There is at least a twofold concern on this particular ("Long term Mining incentives") front: one is that the long-held argument is that security of the Bitcoin system in the long term depends on fee income funding autonomous, anonymous, decentralized miners profitably applying enough hash-power to make reorganizations infeasible. For fees to achieve this purpose, there seemingly must be an effective scarcity of capacity. The second is that when subsidy has fallen well below fees, the incentive to move the blockchain forward goes away. An optimal rational miner would be best off forking off the current best block in order to capture its fees, rather than moving the blockchain forward... Tools like the Lightning network proposal could well allow us to hit a greater spectrum of demands at once--including secure zero-confirmation (something that larger blocksizes reduce if anything), which is important for many applications. With the right technology I believe we can have our cake and eat it too, but there needs to be a reason to build it; the security and decentralization level of Bitcoin imposes a hard upper limit on anything that can be based on it.
Another key point here is that the small bumps in blocksize which wouldn't clearly knock the system into a largely centralized mode--small constants--are small enough that they don't quantitatively change the operation of the system; they don't open up new applications that aren't possible today. The procedure I'd prefer would be something like this: if there is a standing backlog, we-the-community of users look to indicators to gauge if the network is losing decentralization and then double the hard limit with proper controls to allow smooth adjustment without fees going to zero (see the past proposals for automatic block size controls that let miners increase up to a hard maximum over the median if they mine at quadratically harder difficulty), and we don't increase if it appears it would be at a substantial increase in centralization risk. Hardfork changes should only be made if they're almost completely uncontroversial--where virtually everyone can look at the available data and say "yea, that isn't undermining my property rights or future use of Bitcoin; it's no big deal". Unfortunately, every indicator I can think of except fee totals has been going in the wrong direction almost monotonically along with the blockchain size increase since 2012 when we started hitting full blocks and responded by increasing the default soft target. This is frustrating; many people--myself included--have been working feverishly hard behind the scenes on Bitcoin Core to increase the scalability. This work isn't small-potatoes boring software engineering stuff; I mean even my personal contributions include things like inventing a wholly new generic algebraic optimization applicable to all EC signature schemes that increases performance by 4%, and that is before getting into the R&D stuff that hasn't really borne fruit yet, like fraud proofs.
Today Bitcoin Core is easily >100 times faster to synchronize and relay than when I first got involved on the same hardware, but these improvements have been swallowed by the growth. The ironic thing is that our frantic efforts to keep ahead and not lose decentralization have both not been enough (by the best measures, full node usage is the lowest its been since 2011 even though the user base is huge now) and yet also so much that people could seriously talk about increasing the block size to something gigantic like 20MB. This sounds less reasonable when you realize that even at 1MB we'd likely have a smoking hole in the ground if not for existing enormous efforts to make scaling not come at a loss of decentralization.
In short, without either a fixed blocksize or fixed fee per transaction, Bitcoin will not survive as there is no viable way to pay for PoW security. The latter option - fixed fee per transaction - is non-trivial to implement in a way that's actually meaningful - it's easy to give miners "kickbacks" - leaving us with a fixed blocksize. Even a relatively small increase to 20MB will greatly reduce the number of people who can participate fully in Bitcoin, creating an environment where the next increase requires the consent of an even smaller portion of the Bitcoin ecosystem. Where does that stop? What's the proposed mechanism that'll create an incentive and social consensus to not just 'kick the can down the road'(3) and further centralize but actually scale up Bitcoin the hard way?
I am - in general - in favor of increasing the size of blocks...

Controversial hard forks. I hope the mailing list here today already proves it is a controversial issue. Independent of personal opinions pro or against, I don't think we can do a hard fork that is controversial in nature. Either the result is effectively a fork, and pre-existing coins can be spent once on both sides (effectively failing Bitcoin's primary purpose), or the result is one side forced to upgrade to something they dislike - effectively giving a power to developers they should never have. Quoting someone: "I did not sign up to be part of a central banker's committee".

The reason for increasing is "need". If "we need more space in blocks" is the reason to do an upgrade, it won't stop after 20 MB. There is nothing fundamental possible with 20 MB blocks that isn't with 1 MB blocks.

Misrepresentation of the trade-offs. You can argue all you want that none of the effects of larger blocks are particularly damaging, so everything is fine. They will damage something (see below for details), and we should analyze these effects, be honest about them, and present them as a trade-off we choose to make to scale the system better. If you just ask people if they want more transactions, of course you'll hear yes. If you ask people if they want to pay less taxes, I'm sure the vast majority will agree as well.

Miner centralization. There is currently, as far as I know, no technology that can relay and validate 20 MB blocks across the planet, in a manner fast enough to avoid very significant costs to mining. There is work in progress on this (including Gavin's IBLT-based relay, or Greg's block network coding), but I don't think we should be basing the future of the economics of the system on undemonstrated ideas.
Without those (or even with), the result may be that miners self-limit the size of their blocks to propagate faster, but if this happens, larger, better-connected, and more centrally-located groups of miners gain a competitive advantage by being able to produce larger blocks. I would like to point out that there is nothing evil about this - a simple feedback to determine an optimal block size for an individual miner will result in larger blocks for better connected hash power. If we do not want miners to have this ability, "we" (as in: those using full nodes) should demand limitations that prevent it. One such limitation is a block size limit (whatever it is). Ability to use a full node. Skewed incentives for improvements... without actual pressure to work on these, I doubt much will change. Increasing the size of blocks now will simply make it cheap enough to continue business as usual for a while - while forcing a massive cost increase (and not just a monetary one) on the entire ecosystem. Fees and long-term incentives. I don't think 1 MB is optimal. Block size is a compromise between scalability of transactions and verifiability of the system. A system with 10 transactions per day that is verifiable by a pocket calculator is not useful, as it would only serve a few large banks' settlements. A system which can deal with every coffee bought on the planet, but requires a Google-scale data center to verify, is also not useful, as it would be trivially out-competed by a VISA-like design. The usefulness lies in a balance, and there is no optimal choice for everyone. We can choose where that balance lies, but we must accept that this is done as a trade-off, and that that trade-off will have costs such as hardware costs, decreasing anonymity, less independence, smaller target audience for people able to fully validate, ... Choose wisely.
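The propagation concern in this quote can be made concrete with a toy orphan-risk model: blocks arrive roughly as a Poisson process with a 600-second mean, so a block that takes t seconds to reach the rest of the network risks being orphaned with probability about 1 - exp(-t/600). The propagation times below are made-up numbers for illustration only:

```python
import math

def orphan_risk(propagation_seconds: float, block_interval: float = 600.0) -> float:
    """Probability a competing block is found before ours finishes propagating.

    Assumes block discovery is a Poisson process with the given mean interval.
    """
    return 1.0 - math.exp(-propagation_seconds / block_interval)

# Hypothetical: a well-connected miner propagates a large block in 5 s,
# a poorly connected one needs 60 s for the same block.
print(round(orphan_risk(5), 4))   # ~0.0083
print(round(orphan_risk(60), 4))  # ~0.0952
```

A miner whose blocks propagate an order of magnitude slower carries many times the orphan risk, which is exactly the competitive advantage for large, well-connected miners that the quote warns about.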
This list is not a good place for making progress or reaching decisions. If Bitcoin continues on its current growth trends, it will run out of capacity, almost certainly by some time next year. What we need to see right now is leadership and a plan that fits in the available time window. I no longer believe this community can reach consensus on anything protocol related. When the money supply eventually dwindles, I doubt it will be fee pressure that funds mining. What I don't see from you yet is a specific and credible plan that fits within the next 12 months and which allows Bitcoin to keep growing.
We've successfully reached consensus for several softfork proposals already. I agree with others that hardforks need to be uncontroversial and there should be consensus about them. If you have other ideas for the criteria for hardfork deployment, I'm all ears. I just hope that by "What we need to see right now is leadership" you don't mean something like "when Gavin and Mike agree it's enough to deploy a hardfork" when you go from vague to concrete. Oh, so your answer to "bitcoin will eventually need to live on fees and we would like to know more about how it will look like then" is "bitcoin is broken long term but that's far away in the future so let's just worry about the present". I agree that it's hard to predict that future, but having some competition for block space would actually help us get more data on a similar situation to be able to predict that future better. What you want to avoid at all cost (the block size actually being used), I see as the best opportunity we have to look into the future. This is my plan: we wait 12 months... and start having full blocks and people having to wait 2 blocks for their transactions to be confirmed some times. That would be the beginning of a true "fee market", something that Gavin used to say was his #1 priority not so long ago (which seems contradictory with his current efforts to avoid that from happening). Having a true fee market seems clearly an advantage. What are the supposedly disastrous negative parts of this plan that make an alternative plan (ie: increasing the block size) so necessary and obvious? I think the advocates of the size increase are failing to explain the disadvantages of maintaining the current size. It feels like the explanation is missing because it should be somehow obvious how the sky will burn if we don't increase the block size soon.
But, well, it is not obvious to me, so please elaborate on why having a fee market (instead of just a price estimator for a market that doesn't really exist) would be a disaster.
No. What I meant is that someone (theoretically Wladimir) needs to make a clear decision. If that decision is "Bitcoin Core will wait and watch the fireworks when blocks get full", that would be showing leadership. I will write more on the topic of what will happen if we hit the block size limit... I don't believe we will get any useful data out of such an event. I've seen distributed systems run out of capacity before. What will happen instead is technological failure followed by rapid user abandonment... We need to hear something like that from Wladimir, or whoever has the final say around here.
it is true that "universally uncontroversial" (which is what I think the requirement should be for hard forks) is a vague qualifier that's not formally defined anywhere. I guess we should only consider rational arguments. You cannot just nack something without further explanation. If his explanation was "I will change my mind after we increase block size", I guess the community should say "then we will just ignore your nack because it makes no sense". In the same way, when people use fallacies (purposely or not) we must expose that and say "this fallacy doesn't count as an argument". But yeah, it would probably be good to define better what constitutes a "sensible objection" or something. That doesn't seem simple though. it seems that some people would like to see that happening before the subsidies are low (not necessarily null), while other people are fine waiting for that but don't want to ever be close to the scale limits anytime soon. I would also like to know for how long we need to prioritize short term adoption in this way. As others have said, if the answer is "forever, adoption is always the most important thing" then we will end up with an improved version of Visa. But yeah, this is progress, I'll wait for your more detailed description of the tragedies that will follow hitting the block limits, assuming for now that it will happen in 12 months. My previous answer to the nervous "we will hit the block limits in 12 months if we don't do anything" was "not sure about 12 months, but whatever, great, I'm waiting for that to observe how fees get affected". But it should have been a question "what's wrong with hitting the block limits in 12 months?"
Transcript of the community Q&A with Steve Shadders and Daniel Connolly of the Bitcoin SV development team. We talk about the path to big blocks, new opcodes, selfish mining, malleability, and why November will lead to a divergence in consensus rules. (Cont in comments)
We've gone through the painstaking process of transcribing the linked interview with Steve Shadders and Daniel Connolly of the Bitcoin SV team. There is an amazing amount of information in this interview that we feel is important for businesses and miners to hear, so we believe it was important to get it in written form. To avoid any bias, the transcript is taken almost word for word from the video, with just a few changes made for easier reading. If you see any corrections that need to be made, please let us know. Each question is in bold, and each question and response is timestamped accordingly. You can follow along with the video here: https://youtu.be/tPImTXFb_U8
Connor: 0:02:19.68,0:02:45.10 Alright, so thank you Daniel and Steve for joining us. We're joined by Steve Shadders and Daniel Connolly from nChain, who are also the lead developers of the Satoshi’s Vision client. So Daniel and Steve, do you guys just want to introduce yourselves before we get started here - who are you guys and how did you get started? Steve: 0:02:38.83,0:03:30.61
So I'm Steve Shadders and at nChain I am the director of solutions in engineering and specifically for Bitcoin SV I am the technical director of the project which means that I'm a bit less hands-on than Daniel but I handle a lot of the liaison with the miners - that's the conditional project.
Hi I’m Daniel I’m the lead developer for Bitcoin SV. As the team's grown that means that I do less actual coding myself but more organizing the team and organizing what we’re working on.
Connor: 0:03:23.07,0:04:15.98 Great, so we took some questions - we asked on Reddit to have people come and post their questions. We tried to take as many of those as we could and eliminate some of the duplicates, so we're gonna go through each question one by one. We added some questions of our own in, and we'll try and get through most of these if we can. So I think we just wanted to start out and ask: you know, Bitcoin Cash is a little bit over a year old now, and Bitcoin itself is ten years old, but over the past year or so, what has the process been like for you guys working with the multiple development teams, and, you know, why is it important that the Satoshi’s Vision client exists today? Steve: 0:04:17.66,0:06:03.46
I mean, yes, well, we’ve been in touch with the developer teams for quite some time - I think a bi-weekly meeting of Bitcoin Cash developers across all implementations started around November last year. I myself joined those in January or February of this year, and Daniel a few months later. So we communicate with all of those teams and I think, you know, it's not been without its challenges. It's well known that there's a lot of disagreement around it, but what I do look forward to in the near future is a day when the consensus issues themselves are all rather settled, and if we get to that point then there's not going to be much reason for the different developer teams to disagree on stuff. They might disagree on non-consensus related stuff, but that's not the end of the world because, you know, Bitcoin Unlimited is free to go and implement whatever they want in the back end of Bitcoin Unlimited, and Bitcoin SV is free to do whatever they want in their back end, and if they interoperate on a non-consensus level, great. If they don't, not such a big problem - there will obviously be bridges between the two. So, yeah, I think going forward the complications of having so many personalities with wildly different ideas are going to get less and less.
Cory: 0:06:00.59,0:06:19.59 I guess moving forward now another question about the testnet - a lot of people on Reddit have been asking what the testing process for Bitcoin SV has been like, and if you guys plan on releasing any of those results from the testing? Daniel: 0:06:19.59,0:07:55.55
Sure, yeah. So our release was concentrated on stability, right, with the first release of Bitcoin SV, and that involved doing a large amount of additional testing - particularly not so much at the unit test level, but at the system test level: setting up test networks, performing tests, and making sure that the software behaved as we expected, right. Confirming the changes we made, making sure that there aren’t any other side effects. Because, you know, it was quite a rush to release the first version, we've got our test results documented, but not in a way that we can really release them. We're thinking about doing that, but we’re not there yet.
Just to tidy that up - we've spent a lot of our time developing really robust test processes and the reporting is something that we can read on our internal systems easily, but we need to tidy that up to give it out for public release. The priority for us was making sure that the software was safe to use. We've established a test framework that involves a progression of code changes through multiple test environments - I think it's five different test environments before it gets the QA stamp of approval - and as for the question about the testnet, yeah, we've got four of them. We've got Testnet One and Testnet Two. A slightly different numbering scheme to the testnet three that everyone's probably used to – that’s just how we reference them internally. They're [1 and 2] both forks of Testnet Three. [Testnet] One we used for activation testing, so we would test things before and after activation - that one’s set to reset every couple of days. The other one [Testnet Two] was set to post activation so that we can test all of the consensus changes. The third one was a performance test network which I think most people have probably have heard us refer to before as Gigablock Testnet. I get my tongue tied every time I try to say that word so I've started calling it the Performance test network and I think we're planning on having two of those: one that we can just do our own stuff with and experiment without having to worry about external unknown factors going on and having other people joining it and doing stuff that we don't know about that affects our ability to baseline performance tests, but the other one (which I think might still be a work in progress so Daniel might be able to answer that one) is one of them where basically everyone will be able to join and they can try and mess stuff up as bad as they want.
Yeah, so we recently shared the details of Testnet One and Two with the other BCH developer groups. The Gigablock test network we've shared with one group so far, but yeah, we're building it, as Steve pointed out, to be publicly accessible.
Connor: 0:10:18.88,0:10:44.00 I think that was my next question. I saw that you posted on Twitter about the revived Gigablock testnet initiative, and it looked like blocks bigger than 32 megabytes were being mined and propagated there, but maybe the block explorers themselves were going down - what does that revived Gigablock test initiative look like? Daniel: 0:10:41.62,0:11:58.34
That's what the Gigablock test network is. The Gigablock test network was first set up by Bitcoin Unlimited with nChain's help, and they did some great work on that, and we wanted to revive it. So we wanted to bring it back and do some large-scale testing on it. It's a flexible network - at one point we had eight different large nodes spread across the globe, sort of mirroring the old one. Right now we've scaled back because we're not using it at the moment, so you'll notice, I think, three. We have produced some large blocks there, and it's helped us a lot in our research into the scaling capabilities of Bitcoin SV, so it's guided the work that the team's been doing for the last month or two on the improvements that we need for scalability.
I think that's actually a good point to frame where our priorities have been, in two separate stages. As Daniel mentioned before, because of the time constraints we kept the change set for the October 15 release as minimal as possible - it was just the consensus changes. We didn't do any work on performance at all, and we put all our focus and energy into establishing the QA process and making sure that that change was safe, and that was a good process for us to go through. It highlighted what we were missing in our team - we got our recruiters very busy recruiting a Test Manager and more QA people. The second stage after that is performance-related work which, as Daniel mentioned, was guided by the results of our performance testing - they fed into what tasks we were going to start working on. Now that work is still in progress - for some of the items that we identified, the code is done and going through the QA process, but it's not quite there yet. That's basically the two-stage process we've been through so far. We have a roadmap that goes further into the future and outlines more, but primarily it's been QA first, performance second. The performance enhancements are close on the horizon, but some of that work should be ongoing for quite some time.
Some of the changes we need for performance are really quite large and really get down into the base level of the software. There are two main groups of them: ones that are internal to the software - to Bitcoin SV itself - improving the way it works inside, and then others that interface it with the outside world. For one of those in particular we're working closely with another group to make a compatible change - it's not consensus-changing or anything like that - but having the same interface on multiple different implementations will be very helpful, right, so we're working closely with them to make improvements for scalability.
Connor: 0:14:32.60,0:15:26.45 Obviously for Bitcoin SV one of the main things that you guys wanted to do, that some of the other developer groups weren't willing to do right now, is to increase the maximum default block size to 128 megabytes. I wanted to pick your brains a little bit about that - a lot of the objection to either removing the block size limit entirely or increasing it on a larger scale is this idea of the infinite block attack, and that came through in a lot of the questions. What are your thoughts on the "infinite block attack"? Is it something that really exists, is it something that miners themselves should be more proactive in preventing, or, I guess, what are your thoughts on this attack that everyone says will happen if you uncap the block size? Steve: 0:15:23.45,0:18:28.56
I'm often quoted on Twitter and Reddit - I've said before that the infinite block attack is bullshit. Now, that's a statement that I suppose is easy to take out of context, but I think there are probably two schools of thought about the 128 MB limit. There are some people who think that you shouldn't increase the limit to 128 MB until the software can handle it, and there are others who think that it's fine to do it now so that the limit is already raised when the software improves and can handle it, and you don't run into the limit. Obviously we're from the latter school of thought. As I said before, we've got a bunch of performance enhancements in the pipeline. If we wait till May to increase the block size limit to 128 MB, then those performance enhancements will go in, but we won't be able to actually demonstrate them on mainnet. As for the infinite block attack itself, there are a number of mitigations that you can put in place. Firstly, going down to a bit of the tech detail: when you send a block message, or any peer-to-peer message, there's a header which declares the size of the message. If someone says they're sending you a 30MB message and while you're receiving it it gets to 33MB, then obviously you know something's wrong, so you can drop the connection. If someone sends you a message that's 129 MB and you know the block size limit is 128 MB, you know it's kind of pointless to download that message. These are just some of the mitigations you can put in place. When I say the attack is bullshit, I mean it is bullshit in the sense that it's really quite trivial to prevent it from happening. I think there is a bit of a school of thought in the Bitcoin world that if it's not in the software right now then it kind of doesn't exist. I disagree with that, because there are small changes that can be made to work around problems like this.
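The two mitigations Steve describes can be sketched in a few lines. This is purely illustrative Python (the real node is C++, and the names `MAX_BLOCK_SIZE` and `should_drop` are invented here): drop the peer if it sends more bytes than its header declared, or if the declared size exceeds the consensus block size limit.

```python
# Hedged sketch of the message-size mitigations described above.
# Not actual Bitcoin SV code; names are illustrative assumptions.

MAX_BLOCK_SIZE = 128 * 1000 * 1000  # 128 MB consensus limit

def should_drop(declared_size: int, bytes_received: int) -> bool:
    """Return True if the peer connection should be dropped."""
    # A message claiming to be larger than any valid block is
    # pointless to download at all.
    if declared_size > MAX_BLOCK_SIZE:
        return True
    # A peer sending more bytes than its header declared is misbehaving.
    if bytes_received > declared_size:
        return True
    return False
```

So the 129 MB message is rejected from its header alone, and the 30 MB message that keeps growing past its declared size is cut off mid-transfer.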
One other aspect of the infinite block attack - and let's not call it the infinite block attack, let's just call it the large block attack - is that a block can take a lot of time to validate. That can be gotten around by having parallel pipelines for blocks to come in: so you've got a block that's coming in, and it's stuck for two hours or whatever, downloading and validating. At some point another block is going to get mined by someone else, and as long as those two blocks aren't stuck in a serial pipeline, the problem kind of goes away.
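The parallel-pipeline idea reduces to: never let a slow candidate block hold up a fast one. A minimal sketch, assuming invented names (`validate`, `first_valid_block`) and using simulated validation cost, might look like this:

```python
# Illustrative sketch, not node code: validate competing blocks in
# parallel pipelines so a slow "large block attack" cannot stall
# validation of an honest block.
from concurrent.futures import ThreadPoolExecutor, as_completed
import time

def validate(block):
    # Stand-in for real block download + validation work.
    time.sleep(block["validation_cost"])
    return block["name"]

def first_valid_block(blocks):
    """Validate candidate blocks concurrently; return whichever finishes first."""
    with ThreadPoolExecutor(max_workers=len(blocks)) as pool:
        futures = [pool.submit(validate, b) for b in blocks]
        return next(as_completed(futures)).result()
```

With a serial queue, the honest block would have waited behind the attack block; here it wins as soon as its own validation completes.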
Cory: 0:18:26.55,0:18:48.27 Are there any concerns with the propagation of those larger blocks? Because there are a lot of questions around what the practical scaling limit of Bitcoin SV is right now, and concerns around propagating those blocks across the whole network. Steve: 0:18:45.84,0:21:37.73
Yes, there have been concerns raised about it. I think what people forget is that compact blocks and Xthin exist, so a 32MB block does not mean sending 32MB of data in most cases - almost all cases. The concern that I do find legitimate is the Great Firewall of China. Very early on in Bitcoin SV we started talking with miners on the other side of the firewall, and that was one of their primary concerns. We had anecdotal reports of people who were having trouble getting a stable connection any faster than 200 kilobits per second, and even with compact blocks you still need to get the transactions across the firewall. So we've done a lot of research into that - we tested our own links across the firewall, or rather CoinGeek's links across the firewall, as they've given us access to some of their servers so that we can play around, and we were able to get sustained rates of 50 to 90 megabits per second, which pushes that problem quite a long way down the road into the future. I don't know the maths off the top of my head, but the size of the blocks that rate can sustain is pretty large. So we're looking at a couple of options - it may well be that the chattiness of the peer-to-peer protocol causes some of these issues with the Great Firewall, so we have someone building a bridge concept/tool where you basically just have one kind of TX vacuum on either side of the firewall that collects all the transactions up and sends them off every one or two seconds as a single big chunk, to eliminate some of that chattiness. The other is that we're looking at building a multiplexer that will sit and take stuff off the peer-to-peer network on one side, send it over multiple split links, and reassemble it on the other side, so we can transit the Great Firewall without too much trouble. But getting back to the core of your question - yes, there is a theoretical limit to block size from propagation time, and that's kind of where Moore's Law comes in.
Put in faster links and you kick that can further down the road, and you just keep on putting in faster links. I don't think 128 MB blocks are going to be an issue, though, with the speed of the internet that we have nowadays.
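The "TX vacuum" bridge idea above is essentially message batching: instead of one chatty peer-to-peer message per transaction, everything collected in the last interval crosses the link as a single chunk. A minimal sketch, with invented names (`TxBridge`, `send_chunk`) standing in for whatever the actual tool uses:

```python
# Hypothetical sketch of the "TX vacuum" bridge concept described above.
class TxBridge:
    def __init__(self, send_chunk):
        self.send_chunk = send_chunk  # callable that ships one big chunk
        self.buffer = []

    def on_transaction(self, tx):
        # Vacuum up each transaction seen on this side of the firewall.
        self.buffer.append(tx)

    def flush(self):
        """Called every one or two seconds by a timer: send everything
        collected so far as one chunk, eliminating per-tx chattiness."""
        if self.buffer:
            self.send_chunk(self.buffer)
            self.buffer = []
```

A real bridge would also need serialization and a peer on the far side to re-inject the transactions, but the latency/chattiness trade-off is the same.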
Connor: 0:21:34.99,0:22:17.84 One of the other changes that you guys are introducing is increasing the max script size - I think right now it's going from 201 to 500 [opcodes]. A few of the questions we got were, #1, why not uncap it entirely - I think you guys said you ran into some concerns while testing that - and then #2, specifically, we had a question about how certain you are that there are no remaining n-squared bugs or vulnerabilities left in script execution. Steve: 0:22:15.50,0:25:36.79
It's an interesting decision - we were initially planning on removing that cap altogether, in which case the next cap that comes into play (the next effective cap) is a 10,000 byte limit on the size of the script. We took a more conservative route and decided to wind that back to 500 - it's interesting that we got some criticism for that, when the primary criticism levelled against us, I think, was that it's dangerous to increase that limit to unlimited. We did that because we're being conservative. We did some research into these n-squared bugs - sorry, attacks - that people have referred to. We identified a few of them and we had a hard think about it, and thought: look, if we can find this many in a short time, we can fix them all (the whack-a-mole approach), but it does suggest that there may well be more unknown ones. So we thought about taking the whack-a-mole approach, but that doesn't really give us any certainty. We will fix all of those individually, but a more global approach is to make sure that if anyone does discover one of these scripts, it doesn't bring the node to a screaming halt. The problem here is that because the Bitcoin node is essentially single-threaded, if you get one of these scripts that locks up the script engine for a long time, everything behind it in the queue has to stop and wait. So what we want to do - and this is something we've got an engineer actively working on right now - is, once that script validation code path is properly parallelized (parts of it already are), basically assign a few threads for well-known transaction templates and a few threads for any type of script. So if you get a few scripts that are nasty and lock up a thread for a while, that's not going to stop the node from working, because you've got these other lanes of the highway that are exclusively reserved for well-known script templates and they'll just keep on passing through.
Once you've got that in place, I think we're in a much better position to get rid of that limit entirely, because the worst that's going to happen is your non-standard script pipelines get clogged up but everything else will keep ticking along. There are other mitigations for this as well - you could always put a time limit on script execution if you wanted to, and that would be something that would be up to individual miners. Bitcoin SV's job, I think, is to provide the tools for the miners, and the miners can then choose how to make use of them - if they want to set time limits on script execution, then that's a choice for them.
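The "lanes of the highway" routing described above can be sketched very simply: classify each incoming transaction with a template check along the lines of IsStandard, then assign it to a reserved lane or the catch-all lane. This is illustrative Python with invented names, not the node's C++ internals:

```python
# Sketch of reserved validation lanes: well-known templates get their own
# queue (served by dedicated threads), so a pathological non-standard
# script can only clog the catch-all lane.
import queue

standard_lane = queue.Queue()   # drained by threads reserved for templates
other_lane = queue.Queue()      # drained by threads for arbitrary scripts

def is_well_known(tx) -> bool:
    # Stand-in for a template check in the spirit of IsStandard.
    return tx.get("template") == "p2pkh"

def route(tx) -> None:
    """Assign an incoming transaction to a validation lane."""
    lane = standard_lane if is_well_known(tx) else other_lane
    lane.put(tx)
```

Worker threads draining `standard_lane` never touch arbitrary scripts, which is exactly why a slow script cannot stall well-known payments.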
Yeah, I'd like to point out that when a node receives a transaction through the peer-to-peer network, it doesn't have to accept that transaction - it can reject it. If it looks suspicious to the node, it can just say, you know, we're not going to deal with that; or if it takes more than five minutes to execute, or more than a minute even, it can just abort and discard that transaction, right. The only time we can't do that is when it's already in a block, but then the node could decide to reject the block as well. Those are all possibilities in the software.
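The abort-and-discard policy Daniel mentions amounts to an execution budget: check a deadline between interpreter steps and bail out when it expires. A minimal sketch under invented names (`eval_script_with_deadline`; the "steps" are stand-ins for opcode execution):

```python
# Minimal sketch of a per-transaction execution budget, a policy choice
# each miner could make. Purely illustrative; not real interpreter code.
import time

def eval_script_with_deadline(steps, time_limit_s: float) -> bool:
    """Run script 'steps' (callables); return False if the budget expires,
    meaning the transaction is simply discarded, not treated as invalid."""
    deadline = time.monotonic() + time_limit_s
    for step in steps:
        if time.monotonic() > deadline:
            return False  # abort: drop the tx from our mempool
        step()
    return True
```

Note the transcript's distinction: a timed-out transaction is merely discarded by that node; only a transaction already mined into a block forces the block-level accept/reject decision.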
Yeah, and if it's in a block already it means someone else was able to validate it so…
Cory: 0:26:21.21,0:26:43.60 There's a lot of discussion about the re-enabled opcodes coming - OP_MUL, OP_INVERT, OP_LSHIFT, and OP_RSHIFT. Can you maybe explain the significance of those opcodes being re-enabled? Steve: 0:26:42.01,0:28:17.01
Well, one of the most significant things is that, other than two of them (minor variants of existing arithmetic opcodes), they represent almost the complete set of original opcodes. That's not necessarily a technical issue, but it's an important milestone. OP_MUL is one that I've heard some interesting comments about. People ask me why we're putting OP_MUL back in if we're planning on changing the arithmetic operations to big number operations instead of the 32-bit limit that they're currently subject to. The simple answer to that question is that we currently have all of the other arithmetic operations except for OP_MUL. We've got add, divide, subtract, modulo - it's odd to have a script system that's got all the mathematical primitives except for multiplication. The other answer to that question is that they're useful - we've talked about a Rabin signature solution that basically replicates the function of DATASIGVERIFY. That's just one example of a use case for this - most cryptographic primitive operations require mathematical operations, and bit shifts are useful for a whole ton of things. So it's really just about completing that work and completing the script engine - or rather, not completing it, but putting it back the way that it was meant to be.
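To make concrete why multiplication matters for in-script cryptography: Rabin signature verification is essentially a single modular multiplication, checking that s·s mod n equals the (padded) message digest mod n. The sketch below is a toy illustration of that check only (the hashing/padding convention and function names are my assumptions, not the scheme nChain's researchers are building):

```python
# Hedged toy sketch: Rabin signature *verification* is one modular
# multiplication plus a hash, which is why OP_MUL-style arithmetic is
# enough to express it in script. Padding scheme here is illustrative.
import hashlib

def rabin_verify(n: int, message: bytes, signature: int, padding: int) -> bool:
    """Check signature^2 mod n == H(message || padding) mod n."""
    digest = int.from_bytes(
        hashlib.sha256(message + padding.to_bytes(8, "big")).digest(), "big")
    return (signature * signature) % n == digest % n
```

In a real deployment n would be a large product of two secret primes and the signer searches for a padding that makes the digest a quadratic residue; the verifier's side, shown here, stays this simple.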
Connor: 0:28:20.42,0:29:22.62 Big number versus 32-bit. Daniel - I think I saw you answer this on Reddit a little while ago - the new opcodes use logical shifts, whereas Satoshi's versions used arithmetic shifts. The general question that a lot of people keep bringing up, maybe rhetorically, is: why not restore it back to exactly the way Satoshi had it? What are the benefits of changing it now to operate a little bit differently? Daniel: 0:29:18.75,0:31:12.15
Yeah, there are two parts there - the big number one, and LSHIFT being a logical shift instead of arithmetic. When we re-enabled these opcodes, we looked at them carefully and adjusted them slightly, as we did in the past with OP_SPLIT. The new LSHIFT and RSHIFT are bitwise operators. They can be used to implement arithmetic shifts - I think I've posted a short script that did that - but we can't do it the other way around: you couldn't use an arithmetic shift operator to implement a bitwise one. That's because of the ordering of the bytes in the arithmetic values - the values that represent numbers. They're little-endian, which means the bytes are swapped around compared to what many other systems use - what I'd consider normal, big-endian. If you start shifting that as a number, then the shifting sequence in the bytes is a bit strange, so it couldn't go the other way around - you couldn't implement a bitwise shift with an arithmetic one. So we chose to make them bitwise operators - that's what we proposed.
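Daniel's point - bitwise shifts are the more primitive operation, from which a numeric shift can be built - can be illustrated in Python. Assumptions are flagged in the comments: the exact edge-case behaviour of the real OP_LSHIFT may differ, and `double_script_num` handles only non-negative numbers.

```python
# Illustrative sketch, assuming OP_LSHIFT-style semantics: length is
# preserved and bits shifted off the left are discarded.
def lshift_bits(data: bytes, n: int) -> bytes:
    """Bitwise left shift of a byte string viewed as a plain bit string."""
    width = len(data) * 8
    value = (int.from_bytes(data, "big") << n) & ((1 << width) - 1)
    return value.to_bytes(len(data), "big")

def double_script_num(le_num: bytes) -> bytes:
    """Double a non-negative little-endian script number using only the
    bitwise shift above plus byte reversal - the bitwise-to-arithmetic
    direction Daniel says is possible."""
    be = le_num[::-1]                        # view the number as a bit string
    shifted = lshift_bits(b"\x00" + be, 1)   # widen first so no bit is lost
    return shifted[::-1]                     # back to little-endian
```

The reverse construction fails because an arithmetic shift on a little-endian number moves bits across byte boundaries in an order that doesn't correspond to shifting the raw data string.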
That was essentially a decision that was made in May - or rather, a consequence of decisions that were made in May. In May we reintroduced OP_AND, OP_OR, and OP_XOR, and a decision to replace three different string operators with OP_SPLIT was also made then. So that was not a decision that we made unilaterally; it was a decision made collectively with all of the BCH developers - well, not all of them were actually in all of the meetings, but they were all invited.
Another example of that is that we originally proposed OP_2MUL and OP_2DIV, I think - OP_2MUL is a single operator that multiplies the value by two - but it was pointed out that that can very easily be achieved by just doing a multiply by two instead of having a separate operator for it, so we scrapped those, we took them back out, because we wanted to keep the number of operators to a minimum.
There was an appetite for keeping the operator set minimal. The idea to replace OP_SUBSTR, OP_LEFT, and OP_RIGHT with the OP_SPLIT operator actually came from Gavin Andresen. He made a brief appearance in the Telegram workgroups while we were working out what to do with the May opcodes, and obviously Gavin's word carries a lot of weight and we listen to him. But because we had chosen to implement the May opcodes (the bitwise opcodes) to treat the data as big-endian data streams (well, sorry, big-endian isn't really applicable - just plain data strings), it would have been completely inconsistent to implement LSHIFT and RSHIFT as integer operators, because then you would have had a set of bitwise operators that operated on two different kinds of data, which would have been nonsensical and very difficult for anyone to work with. It's a bit like P2SH - it wasn't part of the original Satoshi protocol, but once some things are done, they're done, and if you want to make forward progress you've got to work within the framework that exists.
When we get to the big number implementations, it gets really complicated, because you can't change the behaviour of the existing opcodes - and I don't mean OP_MUL, I mean the other ones that have been there for a while. You can't suddenly make them big number operations without seriously looking at what scripts might be out there and the impact of that change on those existing scripts. The other point is that you don't know what scripts are out there, because of P2SH - there could be scripts whose content you don't know, and you don't know what effect changing the behaviour of these operators would have on them. The big number thing is tricky; there might be other options, but I don't know what they are - it needs some serious thought.
That's something we've reached out to the other implementation teams about - we'd actually really like their input on the best ways to go about restoring big number operations. It has to be done extremely carefully, and I don't know if we'll get there by May next year, or when, but we're certainly willing to put a lot of resources into it, and we're more than happy to work with BU or XT or whoever wants to work with us on getting that done, and getting it done safely.
Connor: 0:35:19.30,0:35:57.49 Along a similar vein - Bitcoin Core introduced this concept of standard and non-standard scripts. I had a pretty interesting conversation with Clemens Ley about use cases for "non-standard scripts", as they're called. I know at least one developer on Bitcoin ABC is very hesitant, or pushed back on him about doing that, so what are your thoughts about non-standard scripts and the IsStandard check in its entirety? Steve: 0:35:58.31,0:37:35.73
I'd actually like to repurpose the concept. I mentioned before multi-threaded script validation and having some threads dedicated to well-known script templates - when you say "well-known script template", there's already a check in Bitcoin that tells you whether a script is well-known or not, and that's IsStandard. I'm generally in favour of getting rid of the notion of standard transactions, but it's actually a decision for miners, and it's really more of a behavioural change than a technical change. There's a whole bunch of configuration options that miners can set that affect what they consider to be standard and not standard, but the reality is that not too many miners are using those configuration options. So standard transactions as a concept is meaningful only to an arbitrary degree, I suppose, but I would like to make it easier for people to get non-standard scripts into Bitcoin so that they can experiment, and from discussions I've had with CoinGeek, they're quite keen on making their miners accept, at least initially, a wider variety of transactions.
So I think IsStandard will remain important within the implementation itself for efficiency purposes - you want to streamline the base use case of cash payments and prioritize them. That's where it will remain important, but on the interfaces from the node to the rest of the network, yeah, I could easily see it being removed.
Cory: 0:38:06.24,0:38:35.46 Connor mentioned that there are some people that disagree with Bitcoin SV and what you're doing - a lot of questions around: why November? Why implement these changes in November? The thinking being that maybe a six-month delay might avoid causing a split. So first off, what do you think about the idea of a potential split, and what is the urgency for November? Steve: 0:38:33.30,0:40:42.42
Well, in November there's going to be a divergence of consensus rules regardless of whether we implement these new opcodes or not. Bitcoin ABC released their spec for the November hard fork change, I think on August 16th or 17th, something like that, and their client as well, and it included CTOR and it included DSV. Now, for the miners that commissioned the SV project, CTOR and DSV are controversial changes, and once they're in, they're in. They can't be reversed - CTOR maybe you could reverse at a later date, but DSV, once someone's put a P2SH transaction, or even a non-P2SH transaction, into the blockchain using that opcode, it's irreversible. So it's interesting that some people refer to the Bitcoin SV project as causing a split - we're not proposing to do anything that anyone disagrees with. There might be some contention about changing the opcode limit, but as for what we're doing - Bitcoin ABC already published their spec for May, and it is our spec for the new opcodes. So in terms of urgency - should we wait? Well, the fact is that we can't. Come November, it's a bit like SegWit - once SegWit was in, yes, you arguably could get it out by spending everyone's anyone-can-spend transactions, but in reality it's never going to be that easy, and it's going to cause a lot of economic disruption. So yeah, that's it. We're putting our changes in because it's not going to make a difference either way in terms of whether there's going to be a divergence of consensus rules - there's going to be a divergence whatever our changes are. Our changes are not controversial at all.
If we didn't include these changes in the November upgrade, we'd be pushing out a release with no changes, right? But the November upgrade is there, so we should use it while we can, adding these non-controversial changes to it.
Connor: 0:41:01.55,0:41:35.61 Can you talk about DATASIGVERIFY? What are your concerns with it? The general concept that's been floated around, because of Ryan Charles, is the idea that it's a subsidy - that it takes a whole megabyte and crunches it down, so the computation time stays the same but the cost is lesser. Do you share his view on that, or what are your concerns with it? Daniel: 0:41:34.01,0:43:38.41
Can I say one or two things about this - there are different ways to look at it, right. I'm an engineer - my specialization is software - so on the economics of it I hear different opinions. I trust some more than others, but I am NOT an economist. With my limited expertise, I kind of agree with the ones who say it's a subsidy - it looks very much like one to me - but that's not my area. What I can talk about is the software. Adding DSV adds really quite a lot of complexity to the code, and it's a big change to add that. And what are we going to do - every time someone comes up with an idea, we add a new opcode? How many opcodes are we going to add? I saw reports that Jihan was talking about hundreds of opcodes or something like that, and it's like: how big is this client going to become, how big is this node, if it has to handle every kind of weird opcode that's out there? The software is just going to get unmanageable. With DSV, my main consideration at the beginning was: if you can implement it in script, you should do it, because that way it keeps the node software simple, it keeps it stable, and it's easier to test that it works properly and correctly. It's almost like adding a new instruction to a microprocessor - why would you do that if you can already implement it in the script that is there?
It's actually an interesting inconsistency, because when we were talking about adding the opcodes in May, the philosophy that seemed to drive the decisions we were able to form a consensus around was to simplify and keep the opcodes as minimal as possible (i.e. where you could replicate a function by using a couple of primitive opcodes in combination, that was preferable to adding a new opcode that replaced them). OP_SUBSTR is an interesting example - it can be achieved with a combination of the SPLIT, SWAP, and DROP opcodes. So at the really primitive script level we've got this philosophy of let's keep it minimal, and then at this other level the philosophy is let's just add a new opcode for every function, and Daniel's right - it's a question of opening the floodgates. Where does it end? If we're just going to go down this road, it almost opens up the argument: why have a scripting language at all? Why not just hard code all of these functions in, one at a time? You know, pay-to-public-key-hash is a well-known construct - you could handle it directly and not bother executing a script at all - but once we've done that, we take away all of the flexibility for people to innovate. So it's a philosophical difference, I think, but I think it's one where the position of keeping it simple does make sense. All of the primitives are there to do what people need to do. The things that people feel they can't do are because of the limits that exist. If we had no opcode limit at all, if you could make a gigabyte transaction, so a gigabyte script, then you could do any kind of crypto that you wanted, even with 32-bit integer operations. Once you get rid of the 32-bit limit, of course, a lot of those scripts come out a lot smaller - a Rabin signature script shrinks from 100MB to a couple of hundred bytes.
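The OP_SUBSTR example above - recovering a removed opcode by composing SPLIT, SWAP, and DROP - can be made concrete with a toy stack machine. This is illustrative Python, not real Bitcoin script, and it ignores script-number encoding details:

```python
# Toy stack-machine sketch of composing primitives to recover SUBSTR.
def op_split(stack):
    n = stack.pop()
    s = stack.pop()
    stack.extend([s[:n], s[n:]])   # push head, then tail (tail on top)

def op_swap(stack):
    stack[-1], stack[-2] = stack[-2], stack[-1]

def op_drop(stack):
    stack.pop()

def substr(s: bytes, start: int, length: int) -> bytes:
    """SUBSTR(s, start, length) built only from SPLIT, SWAP, and DROP."""
    stack = [s, start]
    op_split(stack)        # [head, tail]
    op_swap(stack)         # [tail, head]
    op_drop(stack)         # [tail]
    stack.append(length)
    op_split(stack)        # [middle, rest]
    op_drop(stack)         # [middle]
    return stack.pop()
```

This is the minimalism argument in miniature: one generic primitive (SPLIT) plus stack manipulation subsumes SUBSTR, LEFT, and RIGHT without three dedicated opcodes.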
I lost a good six months of my life diving into script, right. Once you start getting into the language and what it can do, it is really pretty impressive how much you can achieve within script. Bitcoin was designed, was released originally, with script. It didn't have to be - instead of having transactions with script you could have had accounts, and you could say: transfer, you know, so many BTC from this public key to this one - but that's not the way it was done. It was done using script, and script provides so many capabilities if you start exploring it properly, if you start really digging into what it can do. It's really amazing what you can do with script, and I'm really looking forward to seeing some very interesting applications. I mean, Awemany's zero-conf script was really interesting, right. It relies on DSV, which is a problem (and there are some other things that I don't like about it), but him diving in and using script to solve this problem was really cool. It was really good to see that.
I asked a couple of people in our research team who have been working on the Rabin signature stuff this morning, actually - I wasn't sure where they were up to with it - and they're working on a proof of concept (which I believe is pretty close to done) of a Rabin signature script. It will use smaller signatures so that it can fit within the current limits, but it will be effectively the same algorithm (as DSV). I can't give you an exact date on when that will happen, but it looks like we'll have a Rabin signature in the blockchain soon (a mini Rabin signature).
Cory: 0:48:13.61,0:48:57.63 Based on your responses I think I already know the answer to this question, but there are a lot of questions about ending experimentation on Bitcoin. I was going to turn that into: with the plan that Bitcoin SV is on, do you guys see a potential final release - that there will be no new opcodes ever released (like maybe five years down the road we just solidify the base protocol and move forward with that) - or are you more of the idea that we can remain open-ended and introduce new opcodes under appropriate testing? Steve: 0:48:55.80,0:49:47.43
I think you've got to factor in what I said before about the philosophical differences. I think new functionality can be introduced just fine. Having said that - yes, there is a place for new opcodes, but it's probably a limited place, and in my opinion it's the cryptographic primitive functions. For example, CHECKSIG uses ECDSA with a specific elliptic curve, and HASH256 uses SHA-256 - at some point in the future those are going to no longer be as secure as we would like them to be, and we'll replace them with different hash functions and verification functions, but I think that's a long way down the track.
I'd like to see more data too. I'd like to see evidence that these things are needed, and the way I could imagine that happening is that, with the full scripting language, some solution is implemented and we discover that it's really useful, and over a period measured in years, not days, we find a lot of transactions are using this feature - then maybe we should look at introducing an opcode to optimize it. But optimizing before we even know if it's going to be useful? Yeah, that's the wrong approach.
I think that optimization is actually going to become an economic decision for the miners. From the miners' point of view: does it make sense for them to optimize a particular process - does it reduce costs for them such that they can offer a better service to everyone else? Ultimately these decisions are going to be miners' decisions, not developer decisions. Developers of course can offer their input - I wouldn't expect every miner to be an expert on script - but as we're already seeing, miners are actually starting to employ their own developers. And I'm not just talking about us - there are other miners in China that I know have got some really bright people on their staff who question and challenge all of the changes, study them, and produce their own reports. We've been lucky to be able to talk to some of those people and have some really fascinating technical discussions with them.
For the last weeks I've read a lot of BCH drama on this reddit, and now I think that BCH will not survive November's fork, because in all given scenarios the BCH blockchain will split into two or more blockchains:
* BCH ABC - will be supported only by nodes using the latest ABC client,
* BCH X - will be supported by all nodes using the BU and XT clients,
* BCH SV - will be supported only by nodes using the latest SV client.
The real issue with November's fork is not that CTOR is added to the blocks, but that the new ABC client will no longer support TTOR (the current sort order), which means that the ABC client will no longer be compatible with BU, XT, or even SV clients. In other words, the CTOR dispute is just a smokescreen, while the real prize is which of the new blockchains will manage to keep the BCH moniker and all the advantages resulting from it (existing user base, exchange support, and so on). Regardless of foul play from any actors, the possible outcomes are (ordered by likelihood):
Case 1: BCH ABC wins. That will result in a blockchain where Bitmain will easily control more than 70% of the hashrate, and Bitcoin ABC will become BCH's de facto Blockstream.
Case 2: BCH X wins. The value of BCH will plunge because Bitmain/ABC will fork anyway, and there is a big chance that the chain will end up like Bitcoin Gold (51% attacked at some point).
Case 3: BCH SV wins. There is a small probability of this happening because the number of SV nodes is insignificant, but it looks possible considering that CoinGeek and BMG/nChain stand for 40%+ of BCH hashrate at the moment.
Case 4: Nobody wins. Most exchanges will delist BCH, and each blockchain will have to grow from scratch. This might happen anyway on some exchanges.
In the end, it really doesn't matter who wins, because the resulting blockchain will be weaker than the current blockchain. That's close to certain. Now, please tell me why I'm wrong and everything will be alright.
Cobra-Bitcoin commented on 26 Dec 2015: Coinbase is now running Bitcoin XT on their production servers. XT is a contentious hard fork attempt that will create a new altcoin and split the community and blockchain should it ever go into effect. If this ever happens, Coinbase's customers may find that they no longer own any actual Bitcoin. ... Bitcoin Core has already announced a road map to address scalability concerns. I don't see why @barmstrong feels the need to promote XT in this way. Almost all of the Bitcoin technical community supports the announced road map. It's not like things aren't moving forward.
My new year's resolution was to avoid the block size debate. Oh well.
Regardless of what you think of the block size debate, Mike Hearn and his tactics have done nothing but poison the well. He has been intransigent in development; his proposals were routinely shut down because they were horrible on a technical level, and he had many dangerous ideas which undermined the very concept of Bitcoin. When his ideas gained no traction, he created his own Bitcoin client. Because that wasn't enough, he began politicking users to switch to his client. When users found no technical benefit or improvement in his client and failed to adopt it, that wasn't enough either. He then found a tool which he could use to drive a wedge into the community and drive adoption of his client, or so he thought: the block size. Rather than offer constructive support on this issue, he recruited Gavin and others to create an imaginary war that did not exist, of a Bitcoin mafia vs. the people. Coincidentally, it was during this time that he became concerned with the block size that the spam attacks on the network began happening. Heckuva coincidence. Mike, in his dramatic style, proclaimed that Bitcoin was forking, and with that he introduced a new level of politics to Bitcoin beyond anything it had ever experienced. In addition, Mike has managed to turn the scaling issue, a complex technical issue, into a political litmus test, like abortion or gun control - something that was sorely missing in a technical community. As if merely raising the block limit would fix Bitcoin's problems in his eyes. Mike is smart enough to know that wasn't the case, but it didn't matter, because by turning something complex into something simple he was able to sell the story to people who weren't interested in the technical nuances but wanted something to hang their beliefs on. His ideas were perfect fodder for those adopters who were suffering through Bitcoin's long bear market. Mike had the answer: Bitcoin is suffering because the devs refuse to scale it.
Mike Hearn had a simple solution: just change a number - it's so easy, any lay person can see that. Clearly, if you opposed such a simple change you had ulterior motives; the people in opposition were trying to steal Bitcoin. Mike worked in front of and behind the scenes like a dogged politician to create an imaginary timetable on which Bitcoin was about to explode and collapse - watch the price, he said - creating sensationalist media posts that were technically flawed and that painted bookworm engineers, who have done nothing but work in the best interest of Bitcoin, as scheming backroom politicians co-opting Bitcoin for their megalomaniacal ambitions (project much, Mike?). When the timetable for Bitcoin's life-or-death decision came and went, and Bitcoin did not collapse, and no one adopted his client, he cried foul. Ironically, Mike laments that Bitcoin is in the hands of 10 developers (not true), yet he had appointed himself the benevolent dictator of his own coin, Bitcoin-XT. Why was the Bitcoin community not falling on its sword and adopting his plans? Probably because, again, the secret cabal was conspiring to suppress his ideas - ideas which were spammed over social media, which got thousands of page views, which generated hundreds if not thousands of hours of discussion. Despite all this, no one knew that Mike had this beautiful solution to rescue Bitcoin, all because this evil mafia conspired to deny freedom to Bitcoiners around the world and keep his ideas a secret. Meanwhile, actual developers are moving forward with proposals that will scale Bitcoin, but Mike says that isn't enough. Despite his incessant nagging, driving XTers to brigade every Bitcoin discussion and ruining technical discussions on the development mailing list, that still wasn't enough. Nope - unless Mike Hearn got his way, Bitcoin was a failed experiment. Mike Hearn's goal was never about Bitcoin; it was about wrecking Bitcoin.
There is no way you can reconcile his tactics with someone who put Bitcoin first. No one had previously proposed forking Bitcoin to their own client without calling it an altcoin or an alternate implementation. Even Garzik was clear about this in discussion with Satoshi himself. Mike was the first to introduce the idea that creating your own implementation of Bitcoin, one which is not compatible with other implementations and changes a core function of Bitcoin with approval as low as 75%, was not an alt version of Bitcoin but was still Bitcoin. Amazing. He has attempted to change what was an accepted and understood aspect of Bitcoin. Now forking Bitcoin is a grand idea - Bitcoin forks for everyone. Forking Bitcoin is not something new; it has been around since day one, and the community had agreed that a fork of Bitcoin without unanimous consensus was an altcoin. People who care about Bitcoin do not promote altcoins, because it's clear this would fracture Bitcoin and undermine the very method by which Bitcoin secures itself. But in Mike's world, people should undermine their investment in order to get a better investment. Bitcoin has shown that the economic consensus mechanism works, that the consensus will respect the protocol, and that if the time to change the protocol comes, it will be a change that is readily apparent and will be adopted unanimously (Mike, that means without opposition), because from an economic standpoint it makes no sense to undermine Bitcoin by fracturing it. And so, surprise, surprise, Bitcoin participants are making rational economic decisions. Bitcoin is not a democracy where 51% rules; in fact, that is Bitcoin in a state of attack. Still, the very fact that Bitcoin continued to function, and even began to rise again precisely when it was supposed to be collapsing, was a slap in the face to Mike Hearn.
So, the final stroke: a front-page NYT "you can't fire me, I quit" announcement, with a dramatic companion post all over social media, as if Bitcoin had lost some key intellectual power. Sorry Mike, but Bitcoin is not yours to fail. The best thing that could happen to Bitcoin and Mike Hearn is a final divorce, messy though it has been. But has Mike really left us? Something tells me he hasn't. Instead, Mike's new job will be to further the new meme that Bitcoin is a failed experiment, and he should know, because he was a "lead" developer who worked directly with Satoshi. And for proof: this morning I watched the Brookings Institution webstream of their conference Beyond the Blockchain. The very first thing Charley Cooper of R3 CEV (Mike's new employer) brought up was how Bitcoin had failed according to Mike Hearn, how Bitcoin was not going anywhere, and how the future was private blockchains. So let's congratulate Mike on his new role, where he will work to undermine Bitcoin at every turn in front of regulators, banks, VCs, and the public. At least now Mike can stop pretending about what his real goal has been all along, because this has never been about raising the block limit or about a technical issue. This has been about co-opting Bitcoin into either Mike Hearn's coin or something more nefarious, on behalf of greater powers. Mike, good luck, stay strong; I wish you the best. Source: http://bitledger.info/on-mike-hearn-block-size-and-bitcoins-future/
Thank you for being a part of the ColossusXT Reddit AMA! Below we summarize the questions and answers. The team responded to 78 questions! If your question was not included, it may have been answered in a previous question. The ColossusXT team will do a Reddit AMA at the end of every quarter. The winner of the Q2 AMA contest is: Shenbatu. Q: Why does your blockchain exist and what makes it unique? A: ColossusXT exists to provide an energy-efficient method of supercomputing. ColossusXT is unique in many ways. Some coins have one layer of privacy; ColossusXT and the Colossus Grid will utilize two layers of privacy, through obfuscation, the Zerocoin Protocol, and I2P, and these will protect users of the Colossus Grid as they utilize grid resources. There are also Masternodes and Proof of Stake, which both contribute to reducing 51% attacks, along with instant transactions and zero-fee transactions. This protection is paramount as ColossusXT evolves into the Colossus Grid. Grid computing will have a pivotal role throughout the world, and what this means is that users will begin to experience the Internet as a seamless computational universe. Software applications, databases, sensors, video and audio streams - all will be reborn as services that live in cyberspace, assembling and reassembling themselves on the fly to meet the tasks at hand. Once plugged into the grid, a desktop machine will draw computational horsepower from all the other computers on the grid. Q: What is the Colossus Grid? A: ColossusXT is an anonymous blockchain through obfuscation and the Zerocoin Protocol, along with utilization of I2P. These features will protect end-user privacy as ColossusXT evolves into the Colossus Grid. The Colossus Grid will connect devices in a peer-to-peer network, enabling users and applications to rent the cycles and storage of other users' machines. This marketplace of computing power and storage will exclusively run on the COLX currency.
These resources will be used to complete tasks requiring any amount of computation time and capacity, or to allow end users to store data anonymously across the COLX decentralized network. Today, such resources are supplied by entities such as centralized cloud providers, which are constrained by closed networks, proprietary payment systems, and hard-coded provisioning operations. Any user, ranging from a single PC owner to a large data center, can share resources through the Colossus Grid and get paid in COLX for their contributions. Renters of computing power or storage space, on the other hand, may do so at low prices compared to the usual market prices, because they are only using resources that already exist. Q: When will Zerocoin be fully integrated? A: A beta has been released for community testing on testnet. As soon as all the developers consider the code ready for mainnet, it will be released. Testing the code on a larger test network will ensure a smooth transition. Q: Is the end goal for the Colossus Grid to act as a decentralized cloud service, a resource pool for COLX users, or something else? A: The Colossus Grid will act as a grid computing resource pool for any user running a COLX node. How and why we apply the grid to solve world problems will be an ever-evolving story. Q: What do you think of the marketing role in COLX? When will the in-wallet shared nodes be available? I know it's been stated in the roadmap, but as you don't follow the roadmap and offer everything in advance, I hope shared MNs will be available soon. A: The ColossusXT (COLX) roadmap is a fluid design philosophy as the project evolves and our community grows. Our goal is to deliver a working product to the market while at the same time adding useful features for the community to thrive on; perhaps the Colossus Grid and shared Masternodes will both be available by the end of Q4 2018. Q: When will your GitHub be open to the public? A: The GitHub has been open to the public for a few months now.
You can view the GitHub here: https://github.com/ColossusCoinXT The latest commits here: https://github.com/ColossusCoinXT/ColossusCoinXT/commits/master Q: Why should I use COLX instead of Monero? A: ColossusXT offers Proof of Stake and Masternodes, both of which contribute layers of protection against the 51% attacks often associated with Proof of Work consensus, and ColossusXT is environmentally friendly compared to Proof of Work (Monero). You can generate passive income from Proof of Stake and Masternodes, along with helping secure the network. What really sets ColossusXT apart from Monero, and many other privacy projects being worked on right now, is the Colossus Grid. Once plugged into the Colossus Grid, a desktop machine will draw computational horsepower from all the other computers on the grid. Blockchain was built on the core value of decentralization, and ColossusXT adheres to these standards with end-user privacy in mind in the technology sector. Q: With so many coins out with little to no purpose, let alone a definitive use case, how will COLX distinguish itself from the crowd? A: You are right, there are thousands of other coins. Many have no purpose, and we will see others "pumping" from day to day. It is the nature of markets and crypto, as groups move from coin to coin to make a quick profit. As blockchain regulations and information are made more easily digestible, projects like ColossusXT will rise. Our goal is to produce a quality product that will be used globally to solve technical problems; in doing so, grid computing on the ColossusXT network could create markets of its own by utilizing supercomputing resources. ColossusXT is more than just a currency, and our steadfast approach to producing technical accomplishments will not go unnoticed. Q: Tell the crowd something about the I2P integration plan in the roadmap?
🙂 A: ColossusXT will be moving the I2P network layer up in the roadmap to match the quicker development pace of the Colossus Grid. The I2P layer will serve as an abstraction layer, further obfuscating the users of ColossusXT (COLX) nodes. The abstraction layer allows two parties to communicate in an anonymous manner, and the network is optimised for anonymous file-sharing. Q: What kind of protocols, if any, are being considered to prevent or punish misuse of Colossus Grid resources by bad actors, such as participation in a botnet/denial-of-service attack or the storage of stolen information across the Grid? A: What defines bad actors? ColossusXT plans on marketing to governments and cyber security companies globally - entities and individuals who will certainly want their privacy protected. There is a grey area between good and bad, and that is something we can certainly explore as a community. Did you have any ideas to contribute to this evolving variable? What we mean when we say marketing towards security companies and governments is being utilized for some of their projects and innovating new ways of grid computing. Security: https://wiki.ncsa.illinois.edu/display/cybersec/Projects+and+Software Governments: https://www.techwalla.com/articles/what-are-the-uses-of-a-supercomputer Q: The Colossus Grid is well defined, but I don't feel it is easily digestible. Has there been any talk of developing an easier-to-understand marketing plan to help broaden the investor/adopter base? A: As we get closer to the release of the Colossus Grid, marketing for the Colossus Grid will increase. It will have a user-friendly UI, and we will provide guides and FAQs with the release so that any user intending to share computing power will be able to comprehend it. Q: Can you compare ColossusXT and Golem? A: Yes. The Colossus Grid is similar to other grid computing projects. The difference is that ColossusXT is on its own blockchain, and does not rely on the speed or congestion of a third-party blockchain.
The Colossus Grid has a privacy focus and will market to companies and individuals who would like to be more discreet when buying or selling resources, by offering multiple levels of privacy protections. Q: How do you plan to become one of the leaders among privacy coins? A: Being a privacy coin leader is not our end game. Privacy features are just a small portion of our framework. The Colossus Grid will include privacy features, but a decentralized supercomputer is what will set us apart, and we intend to be leading this industry in the coming years as our vision and development continue to grow and scale with technology. Q: With multiple coins within this space, data storage and privacy, how do you plan to differentiate COLX from the rest? Any further partnerships planned? A: The Colossus Grid will differentiate ColossusXT from coins within the privacy space, and the ColossusXT blockchain will differentiate us from the data storage space - combining these two features with the ability to buy and sell computing power for different computational tasks through a decentralized marketplace. We intend to involve more businesses and individuals within the community, and will invite many companies to join in connecting to the grid to utilize shared resources and reduce energy waste globally when the beta is available. Q: Has the Colossus Grid had the best come-up out of all crypto coins? A: Possibly. ColossusXT will continue to "come up" as we approach the launch of the Colossus Grid network. Q: How far has Colossus gone with ATM integration? A: ColossusXT intends to and will play an important role in the mass adoption of cryptocurrencies. We already have an ongoing partnership with PolisPay, which will enable the use of COLX via master debit cards. Along with this established relationship, the ColossusXT team is in touch with other possible companies for wider use of COLX; these can only be disclosed upon mutual agreement.
Q: How does COLX intend to disrupt the computing industry through grid computing? A: Using the Colossus Grid on the ColossusXT blockchain strengthens the network. Computers sit idle for huge portions of the day; connecting to the Colossus Grid and contributing those idle resources can make use of all the computing power going to waste, and can assist in advancing multiple technology sectors and solving issues - reducing costs and waste and increasing speed in sectors such as scientific research, machine learning, and cyber security, while making it possible for anyone with a desktop PC to contribute resources to the Colossus Grid and earn passive income. Q: What kind of partnerships do you have planned, and can you share any of them? :) A: The ColossusXT team will announce partnerships when they are available. It's important to finalize all information and create strong avenues of communication between ColossusXT and the partners it works with in the future. We are currently speaking with many different exchanges and merchants, and discussing options within our technology sector for utilizing the Colossus Grid. Q: Will shared Masternodes be offered by the COLX team? Or will there be any partnerships with something like StakingLab, StakeUnited, or SimplePosPool? StakingLab allows investors of any size to join their shared Masternodes, so any investor of any size can join. Is this a possibility in the future? A: ColossusXT has already partnered with StakingLab. We also plan to implement shared Masternodes in the desktop wallet. Q: How innovative is the Colossus Grid in the privacy coin space? A: Most privacy coins are focused on being just a currency / form of payment. No other project is attempting to do what we are doing with a focus on user privacy. Q: Hey guys, do you plan to integrate with some other platforms, like Bancor? I would like it!
A: ColossusXT is in touch with many exchange platforms; however, due to non-disclosure agreements, details cannot be shared until mutually agreed with the partners. We will always be looking for new platforms to spread the use of COLX in different parts of the world and the crypto space. Q: What is the reward system for Masternode owners? A: From block 388,800 onwards, the block reward is 1200 COLX, and it is split based on the Masternode/staker ratio. This split is governed by a see-saw algorithm: with an increasing number of Masternodes, the see-saw algorithm disincentivizes the establishment of even more Masternodes, because it lowers their profitability. To be precise, as soon as more than 41.5% of the total COLX coin supply is locked in Masternodes, more than 50% of the block reward will be distributed to regular staking nodes. As long as the amount of locked collateral funds is below the 41.5% threshold, the see-saw algorithm ensures that running a Masternode is financially more attractive than running a simple staking node, to compensate for the additional effort a Masternode requires in comparison. Please refer to our whitepaper for more information. Q: What other marketplaces has the COLX team been in contact with? Thanks guys! Love the coin and staff. A: ColossusXT gets in touch with different platforms based on community requests and also based on partnership requests, subject to the ColossusXT business team's mutual agreement. Unfortunately, these possibilities cannot be shared until they are mutually agreed between the partners and the ColossusXT team, due to non-disclosure agreements. Q: What do you think about the new rules that will soon govern crypto interactions in the EU? They are against anonymous payments. A: Blockchain technology is just now starting to become clear to different governments. ColossusXT's privacy features protect the end user from oversharing personal information.
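The see-saw behaviour described in that answer can be sketched as a toy model. The 41.5% threshold and the 1200 COLX reward come from the answer itself; the 60%/45% share figures below are made-up placeholders (the real curve is continuous and defined in the ColossusXT whitepaper):

```python
BLOCK_REWARD = 1200.0   # COLX per block from block 388,800 onwards
THRESHOLD = 0.415       # fraction of total supply locked in Masternodes

def seesaw_split(locked_fraction):
    """Return (masternode_share, staker_share) of one block reward.

    Below the threshold, Masternodes receive the majority of the reward;
    above it, the majority swings to regular staking nodes. The 60/45
    percentages are illustrative placeholders, not the real curve.
    """
    mn_pct = 0.60 if locked_fraction < THRESHOLD else 0.45
    mn = mn_pct * BLOCK_REWARD
    return mn, BLOCK_REWARD - mn
```

The point of the mechanism is visible even in this crude step function: once too much supply is locked in Masternodes, running one becomes less profitable than plain staking, pushing the ratio back toward equilibrium.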
As you are probably aware from the multiple emails you've received recently from many websites, privacy policies are always being updated and expanded upon. The use of privacy features with utility coins like ColossusXT should be a regular norm throughout blockchain. This movement is as much about decentralization as it is about improving technology. While this news may have a role to play, I don't think it is THE role that will continuously be played as blockchain technology is implemented throughout the world. Q: Any hints on the next big feature implementation you guys are working on? According to the roadmap - really excited to hear more about the shared MNs and the scale of the marketplace! A: Current work is focused on the privacy layer of the Colossus Grid and completing the updated wallet interface. Q: Why choose COLX - or should I say, why should we believe COLX will become what you promise in the roadmap? How are you different from all the other privacy coins with blockchain establishments already in effect? A: ColossusXT is an environmentally friendly Proof of Stake coin with Masternode technology, which together provide dual layers of protection from 51% attacks. It includes privacy features that protect users while they utilize resources from the Colossus Grid. Some of the previous questions within this AMA may also answer this question. Q: What tradeoffs do you have using the Colossus Grid versus the more typical distribution? A: The advantage of supercomputers is that since data can move between processors rapidly, all of the processors can work together on the same tasks. Supercomputers are suited for highly complex, real-time applications and simulations. However, supercomputers are very expensive to build and maintain, as they consist of a large array of top-of-the-line processors, fast memory, custom hardware, and expensive cooling systems.
They also do not scale well, since their complexity makes it difficult to easily add more processors to such a precisely designed and finely tuned system. By contrast, the advantage of distributed systems (like the Colossus Grid) is that, relative to supercomputers, they are much less expensive. Many distributed systems make use of cheap, off-the-shelf computers for processors and memory, which only require minimal cooling costs. In addition, they are simpler to scale, as adding an additional processor to the system often consists of little more than connecting it to the network. However, unlike supercomputers, which send data short distances via sophisticated and highly optimized connections, distributed systems must move data from processor to processor over slower networks, making them unsuitable for many real-time applications. Q: Why should I choose Colossus instead of another 100,000 altcoins? A: Many of these altcoins are very different projects. ColossusXT is the only grid computing project with a focus on user privacy. We have instant transactions and zero-fee transactions, and ColossusXT is one of the very few coins to offer live support. Check out our whitepaper! Q: Will there be an option (in the future) to choose between an anonymous or public transaction? A: Zerocoin is an evolution of the current coin-mixing feature. Both allow an individual to decide how they would like to send their transactions. Q: What exchange has the highest volume for ColossusXT, and are there any plans for top exchanges soon? A: Currently Cryptopia carries the majority of ColossusXT volume. We are speaking with many different exchanges and preparing the documentation they have requested. ColossusXT intends to be traded on every major exchange globally. Q: What TPS does the COLX blockchain achieve? A: ColossusXT currently achieves between 65 and 67 TPS, depending on network conditions. Q: Plans on expanding the dev team?
A: As development funds allow it, the team will be expanded. Development costs are high for a unique product like ColossusXT, and a good majority of our budget is allocated to it. Q: Can you explain what the Colossus Grid is and what its full purpose is? A: The Colossus Grid is explained in the whitepaper. The uses for grid computing and storage are vast, and we are only starting to scratch the surface of what this type of computing power can do. There is also a description of the Colossus Grid earlier in this AMA. Q: Is there a mobile wallet for Android and iOS? If not, is it on the roadmap? A: The Android wallet is out of beta and on the Google Play Store; an iOS wallet is planned for development. The roadmap can be found here: https://colossusxt.io/roadmap/ Q: Is ColossusXT planning on partnering up with other cryptocurrency projects, such as Bread and EQUAL? A: ColossusXT plans on partnering with other crypto projects where it makes sense. We look for projects that can help alleviate some of our development work or provide quality-of-life upgrades to our investors, so that we can focus on Colossus Grid development. We absolutely love it when the community comes to us with great projects to explore. Q: Did you ever consider a coin burn? Don't you think a coin burn would increase the COLX price and sustain mass adoption? Do you plan on keeping the price of COLX in a range so that potential big investors can invest in a not-so-volatile project? A: There are no plans to do a coin burn at this time. Please check out the section in our whitepaper about the supply. Q: What is the next big exchange for COLX to be listed on? A: There are several exchanges that will be listing ColossusXT soon. Stay tuned for updates within the community; some have already been announced, and more announcements are coming.
Q: How will COLX compete with other privacy coins which claim to be better, like Privacy? A: ColossusXT is not competing with other privacy coins. ColossusXT will evolve into the Colossus Grid, which is built on the backbone of a privacy blockchain. In our vision, all these other privacy coins are competing for relevancy with ColossusXT. There are also similar responses to questions elsewhere in this AMA that may cover the specifics. Q: Does COLX have a finite number of coins like Bitcoin? A: No, ColossusXT is Proof of Stake. https://en.wikipedia.org/wiki/Proof-of-stake Q: What are the advantages of COLX over other competitor coins (e.g. ECA)? A: The only similarity between ColossusXT and Electra is that we are both privacy blockchains. ColossusXT is very much an entirely different project than any other privacy coin in the blockchain world today. The Colossus Grid will be a huge advantage over any other privacy coin, offering the ability for a desktop machine to rent power from others contributing to the Colossus Grid and perform high-level computational tasks. Q: How do you feel about some countries frowning upon privacy coins, and how do you plan to change their minds (and what do you plan to do about it)? A: The ColossusXT team tries to view opinions from multiple perspectives so that we can understand each line of thinking. As blockchain technology becomes more widely adopted, so will the understanding of the importance of the privacy features within ColossusXT. Privacy is freedom. Q: How do you see COLX disrupting cloud gaming services such as PlayStation Now? A: Cloud gaming services have not been discussed. Initial marketing of our private grid computing framework will be targeted at home users, governments, and cyber security firms who may require more discretion and anonymity in their work.
Q: Since COLX is a privacy coin, known for the privacy of its transactions, through which money laundering and scams could take place, would COLX and its community be affected by this? And if so, how could we try to prevent it? A: ColossusXT intends to be known for the Colossus Grid. The Colossus Grid development will be moved up from Q1 2019 to Q3 2018 to reflect this message and prevent further miscommunication about what privacy means for the future of ColossusXT. Previous answers within this AMA may further elaborate on this question. Q: When do you plan to list your coin on other "bigger" exchanges? A: ColossusXT is speaking with many different exchanges. These things have many different factors; exchanges decide on listing dates, and we expect to see ColossusXT listed on larger exchanges as we approach the Colossus Grid beta. The governance system can further assist in funding. Q: What was the rationale behind naming your coin ColossusXT? A: Colossus was a set of computers developed by British codebreakers in the years 1943–1945. XT symbolises "extended", as the coin was forked from the original Cv2 coin. Q: Can you give any details about the e-commerce marketplace and its progress? A: The e-commerce marketplace is a project that will receive attention after our development pass on important privacy features for the grid. In general, our roadmap will be changing to put an emphasis on grid development. Q: How will someone access the grid, and how will you monetize its use? Will there be an interface that charges COLX for time on the grid or for data usage? A: The Colossus Grid will be integrated within the ColossusXT wallet; buying and selling resources will happen within the wallet interface. You won't be able to charge for "time" on the grid and have access to unlimited resources. The goal is to have users input what resources they need and the price they are willing to pay.
The Colossus Grid will then look for people selling resources at a value the buyer is willing to pay. Time may come into play depending on which resources you are specifically asking for. Q: Are there any plans to launch an official YouTube channel with instructional videos about basic use of the wallets and features of COLX? Most people are visual learners and learn much faster about wallets when actually seeing it happen before they try it themselves. This might attract people to ColossusXT and also teach people about basic use of blockchain and cryptocurrency wallets. I ask this because I see a lot of users on Discord and Telegram who are still learning and are asking a lot of really basic questions. A: ColossusXT has an official YouTube account with instructional videos: https://www.youtube.com/channel/UCCmMLUSK4YoxKvrLoKJnzng Q: What are the USPs of COLX compared to other privacy coins? A: Privacy coins are a dime a dozen. ColossusXT has different end goals than most privacy coins, and this cannot be stated enough. Our goal is not just to be another currency, but to build a sophisticated computing-resource-sharing architecture on top of the privacy blockchain. Q: A new exchange would probably bring more liquidity for our coin. If you could choose three exchanges to get COLX listed on, what would be your top three? A: ColossusXT intends to be listed on all major exchanges globally. :) Q: What is the future of privacy coins? What will the future COLX user base be (beyond the first adopters and enthusiasts)? A: The future of privacy is the same as it has always been. Privacy is something each and every person owns, until they give it away to someone else. Who is in control of your privacy - you, or another person or entity? The future of the ColossusXT user base will comprise early adopters, enthusiasts, computer science professionals, and artificial intelligence and computational linguistics professionals, who can utilize the Colossus Grid for a wide range of needs.
Q: Will ColossusXT join more exchanges soon?
A: Yes. :)

Q: So when will Colossus put out lots of advertisement on the various social media sites to get better known? Like YouTube videos, etc.
A: As we get closer to a product launch of the Colossus Grid, you'll begin to see more advertisements, YouTubers, and interviews. We're also looking to give some presentations at blockchain conferences in 2018 and 2019.

Q: In your opinion, what are some of the issues holding COLX back from wider adoption? In that vein, what are some of the steps the team is considering to help address those issues?
A: One of the main issues holding ColossusXT back from wider adoption is that our endgame is very different from other privacy coins: the Colossus Grid. To address this, the ColossusXT team intends to have a Colossus Grid beta out by the end of Q4, and we will move development of the Colossus Grid from Q1 2019 to Q3 2018.

Q: Or to see it from another perspective: what are some of the biggest issues with cryptocurrency, and how does COLX address those issues?
A: The biggest issue is that cryptocurrency is seen as a means to make quick money, with everyone asking which project is going to get the biggest "pump" of the week, and there is not enough focus on building blockchain technologies that solve problems or on creating legitimate business use cases. For the most part we believe the base of ColossusXT supporters see our end-game and are willing to give us the time and support to complete our vision. The ColossusXT team keeps its head down and keeps pushing forward.

Q: I know it's still early in the development phase, but can you give a little insight into what to look forward to regarding the in-wallet voting and proposals system for the community? How much power will the community have over the direction COLX development takes in the future?
A: The budget and proposal system is detailed in the whitepaper.
Masternode owners guide the development of ColossusXT by voting on proposals put forth by the community and business partners. Our goal is to make this process as easy and accessible as possible for our community.

Q: Will there be an article explaining the significance of each partnership formed thus far?
A: Yes, the ColossusXT team will announce partners on social media and community outlets. A detailed article on what partnerships mean will be available on our Medium page: https://medium.com/@colossusxt

Q: What potential output from the grid is expected, and what would its use be? For example, x teraflops, which could process y solutions to protein folding in z time.
A: There are many uses for grid computing: a crypto enthusiast mining crypto, a cybersecurity professional cracking a password using brute force, or a scientist producing climate prediction models. The resources available for grid projects will be determined by the number of nodes sharing resources and the amount of resources an individual is willing to purchase with COLX. No individual will have access to infinite grid resources.

Q: Is there a paper wallet available?
A: Yes, see https://mycolxwallet.org

Q: Is there a possibility of implementing quantum computing measures in the future?
A: This is a great idea for potentially another project in the future. Currently this is not possible with the Colossus Grid. Instead of the bits that conventional computers use, a quantum computer uses quantum bits, known as qubits. In classical computing, a bit is a single piece of information that can exist in two states, 1 or 0. Qubits are quantum systems with two states; however, unlike a usual bit, they can store much more information than just 1 or 0, because they can exist in any superposition of these values.

Q: Do you plan to do a coin burn?
A: No future coin burns are planned.
Anything like this would go through a governance proposal, and Masternode owners would vote on it. This is not anything we've seen being discussed within the community.

Q: Can I check the exact number of current COLX masternodes and COLX staking nodes?
A: Yes. You can view the masternodes and the amount of ColossusXT (COLX) being staked on the block explorer: https://chainz.cryptoid.info/colx/#!extraction

Q: What incentive could we give a YouTuber to make the BEST video about ColossusXT (COLX)?
A: We've been approached by several YouTubers. The best thing a YouTuber can do is understand what ColossusXT is, join the community, and ask questions if there is something they don't understand. The problem with many YouTubers is that some of them are just trying to get paid; they don't really care to provide context or research a project. Disclaimer: this is not all YouTubers, but many.

Q: In which ways is the Colossus Grid different from other supercomputer / distributed computing projects out there? Golem comes to mind. Thanks!
A: The main difference is that we are focused on the end user's privacy, and the types of users we will be targeting are those who need more discretion and anonymity in their work. We are building a framework that will continue to push the boundaries of user privacy as it relates to grid computing.

Q: Can we please complete our roadmap ahead of schedule? I find most other coins that do this actually excel in terms of price and community members. Keep on top of the game :)
A: The ColossusXT roadmap is a very fluid document, and it is always evolving. Some items are moved up in priority, and others are moved back. The roadmap should not be thought of as something set in stone.

Q: Does COLX have masternodes?
A: Yes. ColossusXT has masternodes.

Q: Have you thought about providing a way to embed a COLX payment form in any page that wants to accept cryptocurrencies, in a fast and simple way, to drive mass adoption?
A: There is already this option: https://mycryptocheckout.com/coins/

Q: What do you think of your community's progress so far?
A: The community has grown greatly in the last 3 months. We're very excited to go from 13 to 100 questions in our quarterly AMA. Discord, Telegram, and Twitter are growing every day.

Q: I noticed on the roadmap: Coinomi and Shapeshift wallet integration. Can you tell me more about this? I am new to crypto and a new COLX investor, so I don't know much about this. Thanks, and keep up the good work.
A: Coinomi is a universal wallet. ColossusXT will have multiple wallet platforms available to it. Shapeshift allows you to switch one crypto directly for another without the use of a coupler (BTC).

Q: Is the "general-purpose decentralized marketplace" written in the whitepaper the same as the "E-COMMERCE MARKETPLACE" written on the roadmap? Please tell me about the general-purpose decentralized marketplace, or e-commerce marketplace, in detail.
A: Details will be posted as we get closer to the marketplace. It will be similar to other marketplaces within blockchain. Stay tuned for more information by following us on Twitter.

Q: History has shown that feature-based technologies always get replaced by platforms that incorporate those features; what is Colossus's big picture?
A: The Colossus Grid, which has been explained within this AMA in a few different ways.

Q: What are the main objectives for the COLX team this year? Give me five reasons why COLX will survive in the long term. Do you consider masternodes working in a private, easy-to-set-up wallet on a DEX network? Already a big fan, have a nice day!
A: Going into Q3, our main objective is to get a working product of the Colossus Grid by the end of Q4.
Community - Our community is growing every day as knowledge about what we're building grows. When the Colossus Grid is online, we expect expansion at a rapid pace as users connect to share resources.
Team - The ColossusXT team will continue to grow. We are stewards of a great community and an amazing project, providing a level of support through Discord that is currently unseen in many other projects. The team's cohesion and activity within the community set a standard we intend to establish across blockchain communities.
Features - ColossusXT and the Colossus Grid will have a user-friendly UI. We understand the difficulties users face when they first encounter blockchain products: the confusion between keys, sending/receiving addresses, and the features available within. Guides will always be published for Windows/Mac/Linux, with updates, so that these features can be easily understood.
Colossus Grid - The Colossus Grid answers real world problems, and provides multiple solutions while also reducing energy consumption.
Use Case - Many of the 1000+ other coins on the market don't have the current use case that ColossusXT has, let alone its expanding utility across multiple sectors.
Forking Bitcoin, the first existential milestone

I can understand why Gavin feels that he must do something drastic to force the issue. Bitcoin XT makes a strong argument, mostly around raising the block size limit so Bitcoin can compete with big payment processors; if the developers continue to fight, Bitcoin will be worth nothing in a few years. Bitcoin XT is a patch set on top of Bitcoin Core, with a focus on upgrades to the peer-to-peer protocol. By running it you can opt in to providing the Bitcoin network with additional services beyond what Bitcoin Core provides. Currently it contains the following additional change: support for larger blocks.

Bitcoin's scaling drama has been going on for over two years, but Bloq economist Paul Sztorc presented a solution to the debate back in 2015. The issue with these sorts of hard forks cuts both ways: a prohibition on forking would be tantamount to making the developers of the reference client permanent dictators of the Bitcoin protocol. While I neither support nor oppose the attempted Bitcoin XT fork, the ability to fork Bitcoin is crucial to preserving the currency's independence from anything but the demands of the market.

The Meaning of 'Hard Fork'

In this article, we explore the Bitcoin hard fork and what it means when a cryptocurrency is hard forking. Ever since Caesar, the great Roman warlord, crossed the Rubicon, breaking age-old traditions and norms has increasingly become the only way to create a new world order.
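The mechanics behind the block size fight can be illustrated with a toy consensus-rule check. Nodes enforcing different maximum block sizes diverge the moment a block appears that one rule set accepts and the other rejects; that divergence is a hard fork. The sketch below is a simplified illustration, not real node code: the 1 MB limit Core enforced is real, but the node model and the 8 MB alternative limit are illustrative.

```python
ONE_MB = 1_000_000  # Core's historical maximum block size in bytes

class Node:
    """Toy node that accepts or rejects blocks purely by size."""
    def __init__(self, name, max_block_size):
        self.name = name
        self.max_block_size = max_block_size
        self.chain = []  # sizes of blocks this node has accepted

    def receive(self, block_size):
        if block_size <= self.max_block_size:
            self.chain.append(block_size)
            return True
        return False  # consensus rule violated: block rejected

# A Core-style node enforcing the 1 MB limit vs. an XT-style node
# allowing larger blocks (the 8 MB figure is illustrative).
core_node = Node("core", ONE_MB)
xt_node = Node("xt", 8 * ONE_MB)

for size in [500_000, 900_000, 2_000_000]:
    core_ok = core_node.receive(size)
    xt_ok = xt_node.receive(size)
    if core_ok != xt_ok:
        print(f"chain split at a {size}-byte block")

# After the oversized block, the two nodes follow different chains.
print(len(core_node.chain), len(xt_node.chain))
```

Both nodes agree on every block up to 1 MB; the first 2 MB block is rejected by the Core-style node and accepted by the XT-style node, so from that point on they track incompatible chains. This is why a larger-block rule change is "hard": old nodes cannot follow the new chain without upgrading.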
Hey guys, it's been a wild ride for Bitcoin this past week. We saw Bitcoin tank over the weekend to a low of $938 and then recover to around $1,100. Apparently this is being caused by talk ...

Chris Pacia, Chris Karabats and I watched the Bitcoin Cash fork live, discussed the issues that led up to the fork, and what will happen over the next few weeks. Tomorrow (Friday) we do a follow-up ...

Why do we need another Bitcoin fork, the IBTC fork in particular? In 2009, Bitcoin changed the way people should think about money. Bitcoin is like a scripture written in code. When it was written, it ...

In this video, we will explore the Bitcoin hard fork to Bitcoin Cash and what it means when a cryptocurrency is hard forking. Distributed consensus is a concept that makes blockchain technology ...