Metcalfe’s Law and Bitcoin’s Value - Coinmotion

It took 8 years to reach a $55 billion market cap, then just the last 3 months to reach a $110 billion cap... It did as much in the last 3 months as it did in 8 years?!?

Am I right? (Not trying to go overboard, but this could get crazier than it already is.)
Is a $200 billion cap obtainable, say by February or March?
submitted by S0XXX to Bitcoin [link] [comments]

How is it possible that the number of Ethereum addresses is nearing 4 million while Bitcoin addresses number only about 500,000?

submitted by intuitecon to ethereum [link] [comments]

Bitcoin longterm price chart

Hi, this is my first post here on reddit. I've been in bitcoin since 2013, haven't posted a lot though. I am however quite active on tradingview, and would like to share my view on bitcoin's long-term price growth, so that people might learn how the BTC cycles work, and that the fundamental bitcoin price drivers are:
  1. The halvings
  2. Metcalfe's law and network growth

That's why I think we've almost reached the low in the current bear market. It sure could go as low as 1500-2500, but that's just random noise in the long run. Anyway, here's one of my charts from 7 months ago, which seems to be quite accurate so far:
Price growth follows a square-root function in the logarithmic chart. Long-term target 2030+: 1 million USD per BTC ;)
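For intuition, a square-root trend in log space can be sketched as below; the coefficients `a` and `b` are purely illustrative placeholders, not fitted to any real chart:

```python
import math

# Hypothetical model: log10(price) = a * sqrt(days since genesis) + b.
# a and b below are illustrative, NOT fitted to real BTC data.
a, b = 0.1, -1.0

def model_price(days: int) -> float:
    """Price implied by a square-root trend on a log chart."""
    return 10 ** (a * math.sqrt(days) + b)

# Doubling the elapsed time multiplies the log-price by sqrt(2), not 2,
# so growth keeps decelerating on the log chart.
print(model_price(400))   # sqrt(400)=20, so log10(price)=1 -> price 10.0
```

This is why each successive cycle adds less on the log chart: the exponent grows like the square root of time, not linearly.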
There will be many more bull and bear markets. It's normal and part of the growing process.
submitted by ScifiInstinct to Bitcoin [link] [comments]

Bitcoin vs Metcalfe's Law

Bitcoin vs Metcalfe's Law submitted by tacotacoman1 to Bitcoin [link] [comments]

The Case for an Extreme ETH Mispricing

The format of this post has been modified to be more reddit friendly. Apologies for any momentum lost.
This piece was written in collaboration with u/beerchicken8. He deserves a massive amount of credit and our thought experiment could not have been generated without him.
We wrote this piece to remind the community and new investors that we are incredibly early to this investment, and also to demonstrate that ETH is massively undervalued even if viewed as a network utility token. We meant for this to be as simple, yet impactful as possible. We are not in the practice of writing academic papers, but the narrative is clearly demonstrated.
All data is accurate as of May 22, 2017.
A Crude Valuation of ETH
Pundits and the media will look at the recent price graph and will likely tell you that cryptocurrencies are in a bubble. Sure, the recent price action looks aggressive and may appear unsustainable, but it is hardly a bubble. In fact, it is likely that ETH is significantly undervalued.
ETH Price Graph
Crypto skeptics attempt to value bitcoin or ETH using conventional stock market metrics like P/E ratio or by comparing market capitalizations of crypto versus blue chip companies. These metrics do not fairly translate to cryptocurrencies. We can improve on that.
Metcalfe's Law Image Description
A close friend of mine stumbled across Metcalfe’s Law in an effort to properly value the market price of ETH, the cryptocurrency of ethereum. We can think of ETH as a demand-driven digital asset, since it is converted to gas to execute the smart contracts on the blockchain. It provides a vital network function: incentivizing miners to secure the blockchain. Therefore we should attempt to value ETH by attempting to value the ethereum network itself. We can use the daily transactions as our tool.
Metcalfe’s Law aims to value the network effects of communication technologies like the Internet or social networking. The premise is that the value of a telecommunications network is proportional to the square of the number of connected users of the system.
To determine a fair market price of ETH, we can compare the ethereum network transactions squared (or the network value) versus the market cap of ethereum.
In the following chart, we chose to graph the log of our inputs for a better visualization of the correlation.
Log graph of Transactions² and Market cap
The scale is misleading, but looking back we can see that the ETH market cap fell below the network valuation around the time of the DAO hack. The market cap languished as the ETH price suffered from a lack of investor confidence. But as investors licked their wounds and Bitcoin maximalists cheered, ethereum transactions steadily increased; they even outpaced the price correction. Yet that was just the log graph. This is the actual Metcalfe's Law graph, showing the network value of ethereum vs. the market cap:
Metcalfe's Law for Ethereum
We can see clearly that the market cap is significantly lagging the network effect. Theoretically, the network valuation calculated by transactions squared should equal the market cap.
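As a rough sketch of the comparison being made here (the inputs below are hypothetical, not the post's actual May 2017 dataset):

```python
# Compare the "network value" (daily transactions squared) against the
# market cap. If Metcalfe's Law holds, their ratio should stay roughly
# constant over time; a falling ratio means the cap lags the network.

def metcalfe_ratio(market_cap_usd: float, daily_tx: float) -> float:
    """Market cap divided by daily transactions squared."""
    return market_cap_usd / daily_tx ** 2

# Hypothetical figures for illustration only:
eth = metcalfe_ratio(market_cap_usd=17e9, daily_tx=200_000)
print(f"market cap per tx^2: {eth:.3f}")  # 0.425
```

If the ratio is well below its historical norm, the post's argument is that the market cap has room to catch up to the network's usage.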
So here we are. We can conclude that ETH appears cheap. And even that is probably an understatement: if the current network value merely equals the current market cap, we are completely discounting the future growth of the network.
Stock investors will buy stocks on their future earnings and growth potential years in advance. Tesla stock has outperformed every incumbent metric due to tantalizing growth projections, yet Tesla will likely not generate profits for years. In the case of ETH, this growth discount is significant: not only does it not appear to exist in the price, but we can make 3 safe assumptions that forecast the opportunity for incredible growth:
Also, there are additional factors accelerating the scarcity of ETH:
Further Reading: u/mr_yukon_c touched on some other metrics signalling the strength of the Ethereum network in an excellent post the other day:
submitted by pittinout7 to ethtrader [link] [comments]

Remember that BTC price has a ceiling proportional to number of tx it supports

The value of a network is proportional to the square of the number of connected users [Source]. In the case of crypto, we don't have the exact number of users; however, the number of transactions is a good proxy. Though HODLers can increase speculative value, it is the number of users that increases the real network value.
Price closely mirrors the TX count in all major networks:
Price may go further up; however, in the long term it is limited by the block size limit, which is a cap on the number of transactions.
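The ceiling argument can be sketched with rough numbers; the 250-byte average transaction size is an assumption for illustration, not a figure from the post:

```python
# Rough throughput ceiling implied by a fixed block size: a sketch of
# the argument that the tx count (and hence, via Metcalfe's Law, the
# price) has a hard cap. The average tx size is an assumption.

BLOCK_SIZE_BYTES = 1_000_000   # 1 MB blocks
AVG_TX_SIZE_BYTES = 250        # assumed average transaction size
BLOCKS_PER_DAY = 24 * 6        # ~one block every 10 minutes

def max_tx_per_day() -> int:
    """Upper bound on daily transactions under these assumptions."""
    tx_per_block = BLOCK_SIZE_BYTES // AVG_TX_SIZE_BYTES
    return tx_per_block * BLOCKS_PER_DAY

print(max_tx_per_day())  # 576000 tx/day ceiling under these assumptions
```

If value scales with tx², a hard cap on transactions implies a hard cap on the Metcalfe valuation, which is the post's point.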
submitted by blockonomics_co to CryptoCurrency [link] [comments]

Kin’s ticking boxes- a couple more to go.

We all know about the ‘network effect’, Metcalfe’s law etc etc. Kin’s arguably already leading the way on this front. ✔️ etc
Then we have the FCAS scoring system you can see on CoinMarketCap- again Kin’s one of the very highest scorers in the whole of the cryptoverse ✔️
Another metric of ‘success’ is the amount of chatter on Reddit. After generic crypto, bitcoin and eth subs, guess who’s top? Yep. Kin. ✔️
And yet today Kin sits at 236 on the crypto charts. Laughable really.
We all know that the three (main) things that have held or are holding us back are: lack of liquidity/exchanges, the seemingly endless ‘pivoting’ on blockchain plans and let’s not forget the SEC.
The blockchain should now be sorted (no more pivots), the liquidity/exchanges should be easier to organise now we have migrated so that just leaves the SEC...
Should the threat of the SEC really drag us down from hitting the top 10 in most metrics to being valued as a coin in the 230s? Personally, I don't think so, bearing in mind there's a shit-load of other projects (higher up the charts) that will also fall foul of the SEC.
Like the rest of you, I’m curious to see how things look when the migration is complete and we’re finally on some liquid exchanges🤞
submitted by DanielCKin to KinFoundation [link] [comments]

Bitcoin Metcalfe's Law Analysis

First, I would like to credit an article by Willy Woo from September as inspiration for this analysis. Secondly, I'm posting in the spirit of spurring discussion and getting feedback. I'm most definitely not an expert, not giving financial advice, yada yada yada. For God's sakes, if I were an "expert" I'd be sipping brandy, smoking cigars, and having some nice times with my girlfriend in the back of a yacht in Thailand, not posting on reddit in a condo eating mangos.
Anyway, Metcalfe's Law: in short, the value of a network is proportional to the square of the number of participants. This means that as members increase linearly, network value will increase quadratically (not exponentially). A simple way to think of it is a doubling in users increases value 4x.
For this analysis to be valid we have to assume that Bitcoin is fundamentally a network (the "internet of money," as Andreas Antonopoulos calls it), not simply analogous to a commodity, stock, or precious metal, and therefore its underlying value is tied to the size of its network.
So how big is the bitcoin network? Alas, there is no way to directly measure the number of Bitcoin users. You can't just count wallets because 1) the vast majority of wallets contain "dust" (<0.001 BTC) and 2) many users own multiple wallets. So the best we can do is find measurable data we think is correlated with the number of users. One data set that's been suggested is network transactions excluding long chains (currently ~200k transactions/day). There are benefits and drawbacks to using this, as I'm sure folks will point out. Let's roll with it.
Taking the current market cap and dividing it by this transaction count squared should (if Metcalfe's Law holds) produce a proportionality constant that remains roughly consistent over time. I'll call this the Transaction Metcalfe Ratio (TMR). Notice that, by the way it's calculated, a high TMR indicates the price is relatively high given the number of actual transactions occurring (i.e., it could indicate the network is "overvalued").
When we calculate the TMR over the past 5 years, we get the following chart:
Two things stand out: the Metcalfe ratio was relatively high, and very volatile, during and after the 2013-2014 price run-up.
However, when you look at the ratio starting around Q3 2015, it's remained both low and remarkably consistent, settling around 0.5 and occasionally spiking to just above 1. This is an order of magnitude lower than where we were in 2013. Interesting indeed.
TLDR: Market cap may be holding consistent with Metcalfe's Law for at least a year and a half now. I like brandy.
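A minimal sketch of the TMR calculation described above, with illustrative inputs rather than the author's actual dataset:

```python
# Transaction Metcalfe Ratio (TMR): market cap divided by (daily
# transactions excluding long chains) squared. Inputs are illustrative.

def tmr(market_cap_usd: float, daily_tx: float) -> float:
    """TMR = market cap / (daily tx)^2; high values suggest overvaluation."""
    return market_cap_usd / daily_tx ** 2

def implied_cap(target_tmr: float, daily_tx: float) -> float:
    """Market cap consistent with a given TMR at a given tx level."""
    return target_tmr * daily_tx ** 2

# With ~200k tx/day, a TMR of 0.5 corresponds to a ~$20B market cap:
print(tmr(20e9, 200_000))        # 0.5
print(implied_cap(0.5, 200_000)) # 20000000000.0
```

The inverse function shows why a stable TMR is useful: given a tx level, it pins down the market cap the model considers "fair."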
submitted by Platypodes_Attack to BitcoinMarkets [link] [comments]

Food for thought - $250B Protocol

Just some food for thought.
The number of Ethereum wallets is going parabolic; yesterday over 100k new addresses were created. Ethereum transactions per day are in the 500k ballpark. Metcalfe's Law says the value of a network is n squared, where n = number of users. If you take BTC, which averages 330k tx/day, and apply the formula, you get a total network value of $109B.
Bitcoin's current market cap is $113B. So if this is a lagging indicator, ETH's value should be in the range of $250B, or $2631/ETH.
Because the number of transactions by itself could be manipulated, this should only be one part of a valuation model.
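The post's back-of-the-envelope math can be reproduced as follows; the ~95M circulating ETH supply is my assumption, inserted to recover the quoted per-ETH figure:

```python
# Calibrate the Metcalfe constant k on Bitcoin (~330k tx/day mapped to a
# ~$109B network value per the post), then apply it to Ethereum's
# ~500k tx/day. ETH supply below is an approximation, not from the post.

BTC_TX_PER_DAY = 330_000
BTC_NETWORK_VALUE = 109e9     # the post's Metcalfe value for BTC
ETH_TX_PER_DAY = 500_000
ETH_SUPPLY = 95_000_000       # approximate circulating ETH (assumption)

k = BTC_NETWORK_VALUE / BTC_TX_PER_DAY ** 2
eth_value = k * ETH_TX_PER_DAY ** 2
print(f"ETH network value: ${eth_value / 1e9:.0f}B")        # ~$250B
print(f"implied price:     ${eth_value / ETH_SUPPLY:.0f}")  # ~$2,600/ETH
```

Note the whole result hinges on tx counts being a fair proxy for users on both chains, which is exactly the caveat raised above.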
So the next Ethereum bull run, when people start asking why the price is skyrocketing, it'll most likely be an upwards correction based on adoption, network effects, & herd mentality. Don't chase pumps, do your own research and due diligence. Put your money into undervalued assets and sell when they become over valued.
Bitcoin addresses
Bitcoin transactions per day
Ethereum addresses
Ethereum transactions per day
submitted by Hiphopsince1988 to ethtrader [link] [comments]

It currently feels like every chart analyst and every "expert" says this is the bottom and I'm bearish af.

In general I think it is always a good sign if several indicators/reasons point in one direction. I see the following:
Previous bubbles have lost more than we currently have.
Previous bubbles had similar "resistances" but eventually broke through them, followed by crashes that were highly unexpected.
Metcalfe's law says we have room to fall:
Everyone is expecting another bull run, pointing towards institutional money. But I think institutional investors' interest also rises with the price, and vice versa. A further price drop could make them less inclined to invest.
Previous bull markets occurred after a first stable and then slightly increasing period. I think even if prices don't fall further, they would go sideways for half a year.
In financial markets the obvious answer is often wrong. Someone always needs to be fooled in order for others to make money. I think this time it will be the chart analysts and experts who did not go through the last bear market (fucking endless). I don't say chart analysis is complete bullshit; it can be useful sometimes. But relying on it 100% of the time can be misleading.
My best guess would be another crash of 50% bringing it to the 3k range.
Disclaimer: I own 1 coin that I bought in 2013 (too broke to buy more). I'm fucking bullish long-term, but I'm shorting BTC with a few hundred bucks and I'm fully aware of the risks.
What do you think?
submitted by VLADIMIROVIC_L to Bitcoin [link] [comments]

Have another perspective about Electroneum supply and decimal points! (RE-POST/reformat)

Argument 1: It is wrong that fewer decimal places = smaller supply
Let's start with the background of the debate:
Hypothesis 1: Electroneum (ETN) has less supply than Bitcoin (BTC)
Hypothesis 2: BTC has less supply than ETN
The argument for Hypothesis 1 is that BTC has 8 "decimal points" and ETN only 2 "decimal points", which brings the argument to something like this:
1 satoshi = 0.00000001 BTC
1 mETN = 0.01 ETN
Max supply of BTC is 21 million, which is equal to 21 × 10^14 satoshi.
Max supply of ETN is 21 billion, which is equal to 21 × 10^11 mETN.
Which means ETN’s total supply is 1/1000 of BTC’s.
The argument for Hypothesis 2 is that the "decimal points" are just "fractional notation" and are not significant in the calculation of supply.
Max supply of BTC is 21 Million BTC
Max supply of ETN is 21 Billion ETN
Which means BTC’s total supply is 1/1000 of ETN’s.
At the beginning, when Richard Ells explained the 2 decimal points, I immediately thought: "No, you can't do that! The fraction is insignificant; how can you include that in the calculation of total supply?"
So, for quite a while, just like other people, I tended to believe Hypothesis 2 was right, and it was quite disturbing to think that ETN has this "flaw".
Then I came to the realization that I was thinking in a technical/engineering formula concept and trying to force it onto a dynamic currency calculation.
This is the big flaw in the Hypothesis 2 argument: we are using a pure-math perspective instead of an understanding of how a currency behaves or works.
In math, the unit of numbers is a standard, where 1 (10^0) is the lowest denominator, and decimal points are just "fractions of the unit", which gives the argument that no matter the length of the fraction, 21 million will still be less than 21 billion.
But we are not talking/discussing about “just numbers” here, we are talking about a Currency.
Currency has what is called as “circulating denomination unit”, and every currency has the lowest circulating denomination unit. For US Dollar and UK Pound, it is 1 cent or a penny. In Australian Dollar and Canadian Dollar, it is 5 cents (5 cents coin).
So, when we talk about BTC having 8 decimal points and ETN having 2 decimal points, we are not just talking about fractions of a unit; we are talking about the "lowest circulating denomination unit" of a currency.
Consider this:
Let's say that ETN and BTC mining is getting harder and harder. The lowest unit a miner can earn is 0.01 ETN in the Electroneum supply and 0.00000001 BTC in the Bitcoin supply (we are not talking about price at the moment, but focusing on supply).
With 0.01 as the lowest unit that can be mined in ETN, ETN's max supply in smallest units is 21 billion / 0.01, or 21 × 10^11.
With 0.00000001 as the lowest unit that can be mined in BTC, BTC's max supply in smallest units is 21 million / 0.00000001, or 21 × 10^14.
Which simply means ETN will reach its max supply (in smallest units) sooner, hence the lower supply in those terms.
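The smallest-unit arithmetic above can be checked directly; this is only a sketch of the post's argument, not an endorsement of it:

```python
# Compare BTC and ETN max supplies measured in their smallest
# circulating units, as the post does.

BTC_MAX_SUPPLY = 21_000_000       # BTC
BTC_SMALLEST_UNIT = 0.00000001    # 1 satoshi
ETN_MAX_SUPPLY = 21_000_000_000   # ETN
ETN_SMALLEST_UNIT = 0.01          # 1 "mETN" in the post's notation

btc_units = round(BTC_MAX_SUPPLY / BTC_SMALLEST_UNIT)
etn_units = round(ETN_MAX_SUPPLY / ETN_SMALLEST_UNIT)
print(f"{btc_units:.1e} satoshi")  # 2.1e+15, i.e. 21 x 10^14
print(f"{etn_units:.1e} mETN")     # 2.1e+12, i.e. 21 x 10^11
print(btc_units // etn_units)      # 1000: ETN has 1/1000 the smallest units
```

So both hypotheses are arithmetically right about different quantities: measured in whole coins, ETN's supply is 1000× BTC's; measured in smallest units, it is 1/1000.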
If we were talking about a token, then arguably Hypothesis 2 could be right, because tokens are just tokens; they are not designed to be a currency, and there are significant differences.
Electroneum is designed as a currency and projected to be the “de-facto mass adopted” currency. So, think about it as a currency, NOT token.
Let’s get further into this with some examples of “real” (i.e.: fiat) currencies.
We know that the US Dollar is currently one of the most (if not the most) dominant and popular fiat currencies. At the time of writing, these are the exchange rates of USD (US Dollar) against some other currencies (source:
1 IDR (Indonesian Rupiah) = 0.0000736762 USD (US Dollar)
1 USD = 13572.90414000722 IDR
1 VND (Vietnamese Dong) = 0.0000440036 USD (US Dollar)
1 USD = 22725.4133752693 VND
Argument 2: The 2 decimal point in Electroneum is a flaw?
Now, let’s talk about the lowest transactional/circulating denomination unit of USD. It is $0.01 USD, which is represented by a penny or 1 cent coin.
Have you ever heard anyone say: "Hey, because USD's lowest denomination unit is $0.01, this doesn't make sense, this is a flaw. How am I going to exchange 100 IDR to USD? Based on the exchange rate, 100 IDR is 0.00736762 USD, not even 1 cent."
The answer is you don’t. People (currency users) behaviour change, and they adjust to it.
Currency is dynamic and involves people’s behaviour, not a static Math numbers.
If you go to Bali (Bali is in Indonesia), the most common lowest denomination unit of IDR in circulation is Rp 500. There might be 200, 100, and 50 IDR coins in circulation, but people don't use them much. I saw some drivers (taxi or Uber) using 500 IDR as a "parking fee" or as "small change"; some "parking operators" even say that 500 IDR is not enough for a parking fee, it has to be 1000 or 2000 IDR or even more.
Let's say you come back from Bali with 10000 IDR left in your pocket, so you go to a bank to convert it to USD. You will get a weird look from the teller, as if to say "Are you serious?"
Why? Because it is not practical or common to exchange 10000 IDR to USD (in cash), as it is less than 1 USD (not including fees/commission). They EXPECT you to exchange a much larger amount of IDR.
Argument 3: The 2 decimal points will limit the price of Electroneum to something like $100, so 0.01 ETN will still be $1
Consider this:
I never heard anyone say: "Oh no, these 2 decimal points in USD will limit the price of USD, so 1 USD will never go beyond 100 or 1000 IDR."
The fact is, in the 90s, 1 USD was around 1000 IDR or maybe even less, and now it is more than 13000 IDR.
How can 2 decimal points affect or limit the price of a currency? I cannot understand the argument.
And this is just a fiat currency, as we all know that crypto currency is more “explosive” in creating price trend.
Borrowing some examples from fiat currency again: does this mean that people will never use calculations that go beyond 2 decimal points (like 3 or more)?
No, not necessarily. I believe more decimal points (4 or even finer) are used in calculations of interest, loans, exchange rates, etc. Yet it doesn't change the fact that the circulating denomination unit only has 2 decimal points.
Currency is dynamic; people and businesses will find one way or another to adjust to the price and denomination unit.
For example, in the old days when cash was king: someone buys something in a shop priced at 80 cents and pays with a 1 dollar bill. The owner of the shop realizes that he/she has run out of 20 cent coins and other small coins, so the owner offers candy or chocolate or another cheap item as a replacement for the change. The buyer might actually be happy with that, because the buyer might appreciate the candy or chocolate more than the spare change.
In some payment systems, the numbers are mathematically rounded to the closest denomination unit, like 5 cents or 10 cents.
In terms of ETN, if necessary when the price is skyrocketing, I believe there will be some options, including creating a "sub-currency", akin to "dollar coins". You could buy something in ETN and get the sub-currency as change. The sub-currency can either be pegged to the ETN value or not; that can be defined later, dynamically.
Argument 4: Technology affects the price of currency. Really?
Consider this article or infographic:
The US Dollar is considered one of the world's most counterfeited currencies.
Then, consider another article:
“The International Association of Currency Affairs (IACA) holds an awards ceremony for currencies and individuals that have made great leaps in protecting the integrity of currencies and the technologies that go into creating and manufacturing them. In 2011, the IACA voted the Bank of Uganda as the winner of the best new banknote.”
Yet 1 UGX = 0.000277417 USD. How come? Based on the argument that the more advanced the technology a currency has, the more valuable the currency is, it is supposed to be 1 USD = 0.000277417 UGX, not the opposite, right?
How about Bitcoin? At the moment, BTC has relatively the least technological advantage over other coins. Then why is its price at the top of the chart?
Yes, you don't want a currency that is so easily counterfeited that it becomes a "public secret" that everyone can counterfeit it at will. But you also don't need the most secure anti-counterfeiting technology to give value to a currency or to make it the most valuable currency.
Consider this:
  • US Dollar, EU Euro, Japanese Yen, UK Pound, Australian Dollar, Canadian Dollar and Swiss Franc: why are they perceived as valuable currencies by people, investors and traders?
Because they are the most traded currencies in the world. I understand there are fundamental factors affecting the value, but it cannot be denied that people perceive the most traded currencies as more stable and valuable.
“Analyst says 94% of bitcoin's price movement over the past four years can be explained by one equation”.
That equation is about mass adoption, or the network effect. Put simply, it shows that the success of Bitcoin and its price are NOT because it is the first, the most technologically advanced, or the most feature-rich, but because it has the biggest mass adoption among cryptocurrencies, at least for the time being.
  • Currency is dynamic. It involves people and people's behaviour, not static math numbers or technology features.
  • There are significant differences between a crypto that is projected to be just a token and one that is projected to be a currency.
When we talk about currency, people give value to a currency because of its "fundamental" value (currency relates to the fundamentals of a country, its policies and its users) and because it is among the most traded/used currencies in the world.
But when we talk about cryptocurrency, I think many people agree it is hard to pin down "the fundamentals", so I believe the easiest way to gauge the value is how widely it will be mass adopted. The fundamental values, I believe, will become easier to see at later stages as they become more tangible.
These include relationships/partnerships, networks, how it will scale, how the team keeps progressing and making improvements (including technological improvements), the progression and manifestation of the planned roadmap, how agile the team is in responding to changes and challenges, and, for ETN specifically, the "ETN Community". The ETN community is a big asset for Electroneum like no other cryptocurrency has.
So, what’s the deal with 2 decimal points?
At the beginning, I thought this “2 decimal points” can be a drawback for ETN, but now I think it is a brilliant idea. Consider the following advantages of using 2 decimal points for ETN:
  • Easy to understand, human friendly notation.
Why is this important? For a currency to be widely adopted, people need to be comfortable enough to use it and to understand a transaction quickly and easily. People are already accustomed to 2 decimal points in currency; people's mindsets are already trained to calculate in cents, as in 0.01, and not more decimal points. Thus, this will greatly help mass adoption.
  • Businesses' accounting and business models are set up around this mindset of 2 decimal points, or cents, of currency.
So, if the business community wants to adopt a cryptocurrency like ETN, it is "just natural" adoption.
For example, little things like: how are you going to invoice a customer with a number that has 8 decimal points? It will take longer for a business to adapt and adjust to 8 decimals. It looks like a simple thing, but if you try to implement it in a business model, it can be huge.
Another example: businesses don't need to adjust the format of their receipts/dockets, because the current format already uses 2 decimal points; they just change the currency name to ETN. Compare this with a business having to change the format of its purchase docket/receipt to 8 decimals. I am quite sure it would be quite chaotic in the first few weeks of implementation.
Then how about the data formats in databases, reconciliation processes, etc.? The list of things to adapt for 8 decimals can be very long.
  • When Electroneum tries to create business partnerships and relationships with big enterprises, entities, organisations, networks, etc., they don't need to "overhaul" their systems for ETN to be included.
Thanks to 2 decimal points, which come naturally in every system that uses fiat currency, these integrations can be done faster and more easily.
I think most IT people and developers understand how mind-boggling it can be to change a whole system just to handle 8 decimal points and 2 decimal points at the same time.
The key here is that the integration of systems can be done faster and more easily.
  • One of the targets of Electroneum is to be adopted by the unbanked, which means ETN will become one of the main currencies for the payment system, which might involve, at least at an early stage, features for exchanging between ETN and local currency.
As ETN comes naturally with 2 decimal points, just like fiat currencies, integration with payment systems and legacy point of sale (POS) systems can be done much more easily, because ETN behaves like a fiat currency in terms of calculation (decimal points).
The only differences (or benefits) are that ETN is a cryptocurrency with faster transaction settlement, cheaper transaction fees, no country boundaries (cheaper transfer fees), no "middle man" like banks, no complicated bank account registration, and privacy features.
So, Electroneum community and ETN HODLers, we can look forward to the full realization of Electroneum's potential in the near future. I get the "feeling" that the Electroneum community will play a significant role like in no other cryptocurrency.
PS: In using some currencies as examples, I am not undermining the currencies or their users. I have friends and relatives from some of these countries who I know are much richer than average people in developed countries (in terms of dollar wealth). It is just for the sake of example. I hope no one gets offended by this.
Disclaimer: I am not affiliated with Electroneum, but I am an ETN ICO investor and HODLer. This is my personal opinion and perspective on the matter and is NOT to be taken as advice or a suggestion of any kind.
submitted by CryptoDeluge to Electroneum [link] [comments]

Dr. Peter R. Rizun, managing editor of the first peer-reviewed cryptocurrency journal, is an important Bitcoin researcher. He has also been attacked and censored for months by Core / Blockstream / Theymos. Now, he has been *suspended* (from *all* subreddits) by some Reddit admin(s). Why?

Dr. Peter R. Rizun is arguably one of the most serious, prominent, and promising new voices in Bitcoin research today.
He not only launched the first scientific peer-reviewed cryptocurrency journal - he has also consistently provided high-quality, serious and insightful posts, papers and presentations on reddit (in writing, at conferences, and on YouTube) covering a wide array of important topics ranging from blocksize, scaling and decentralization to networking theory, economics, and fee markets - including:
It was of course probably to be expected that such an important emerging new Bitcoin researcher would be constantly harassed, attacked and censored by the ancien régime of Core / Blockstream / Theymos.
But now, the attacks have risen to a new level, where some Reddit admin(s) have suspended his account Peter__R.
This means that now he can't post anywhere on reddit, and people can no longer see his reddit posts simply by clicking on his user name (although his posts - many of them massively upvoted with hundreds of upvotes - are of course still available individually, via the usual search box).
  • What Reddit admin(s) are behind this reddit-wide banishing of Peter__R?
  • What is their real agenda, and why are they aiding and abetting the censorship imposed by Core / Blockstream / Theymos?
  • Don't they realize that in the end reddit will only harm itself, by forcing the most important new Bitcoin researchers to publish their work elsewhere?
(Some have suggested that Peter__R may have forgotten to use 'np' instead of 'www' when linking to other posts on reddit - a common error which subs like /btc will conveniently catch for the poster, allowing the post to be fixed and resubmitted. If this indeed was the actual justification of the Reddit admin(s) for banning him reddit-wide, it seems like a silly technical "gotcha" - and one which could easily have been avoided if other subs would catch this error the same way /btc does. At any rate, it certainly seems counterproductive for reddit to ban such a prominent and serious Bitcoin contributor.)
  • Why is reddit willing to risk pushing serious discussion off the site, killing its reputation as a decent place to discuss Bitcoin?
  • Haven't the people attempting to silence him ever heard of the Streisand effect?
Below are some examples of the kinds of outstanding contributions made by Peter__R, which Core / Blockstream / Theymos (and apparently some Reddit admin(s)) have been desperately trying to suppress in the Bitcoin community.
Peer-Reviewed Cryptocurrency Journal
Bitcoin Peer-Reviewed Academic Journal ‘Ledger’ Launches
Blocksize as an Emergent Phenomenon
The Size of Blocks: Policy Tool or Emergent Phenomenon? [my presentation proposal for scaling bitcoin hong kong]
Peter R's presentation is really awesome and much needed analysis of the market for blockspace and blocksize.
In case anyone missed it, Peter__R hit the nail on the head with this: "The reason we can't agree on a compromise is because the choice is binary: the limit is either used as an anti-spam measure, or as a policy tool to control fees."
Bigger Blocks = Higher Prices: Visualizing the 92% historical correlation [NEW ANIMATED GIF]
Miners are commodity producers - Peter__R
Fees and Fee Markets
“A Transaction Fee Market Exists Without a Block Size Limit” — new research paper ascertains. [Plus earn $10 in bitcoin per typo found in manuscript]
"A Transaction Fee Market Exists Without a Block Size Limit", Peter R at Scaling Bitcoin Montreal 2015
An illustration of how fee revenue leads to improved network security in the absence of a block size limit.
Greg Maxwell was wrong: Transaction fees can pay for proof-of-work security without a restrictive block size limit
Networks and Scaling
Bitcoin's "Metcalfe's Law" relationship between market cap and the square of the number of transactions
Market cap vs. daily transaction volume: is it reasonable to expect the market cap to continue to grow if there is no room for more transactions?
In my opinion the most important part of Scaling Bitcoin! (Peter R)
Visualizing BIP101: A Payment Network for Planet Earth
A Payment Network for Planet Earth: Visualizing Gavin Andresen's blocksize-limit increase
Is Bitcoin's block size "empirically different" or "technically the same" as Bitcoin's block reward? [animated GIF visualizing real blockchain data]
New blocksize BIP: User Configurable Maximum Block Size
A Block Size Limit Was Never Part Of Satoshi’s Plan : Draft proposal to move the block size limit from the consensus layer to the transport layer
Truth-table for the question "Will my node follow the longest chain?"
Peter R: "In the end, I believe the production quota would fail." #ScalingBitcoin
Decentralized Nodes, Mining and Development
Centralization in Bitcoin: Nodes, Mining, Development
Deprecating Bitcoin Core: Visualizing the Emergence of a Nash Equilibrium for Protocol Development
What is wrong with the goal of decentralizing development across multiple competing implementations? - Peter R
Potentially Unlimited, "Fractal-Like" Scaling for Bitcoin: Peter__R's "Subchains" proposal
"Reduce Orphaning Risk and Improve Zero-Confirmation Security With Subchains" — new research paper on 'weak blocks' explains
A Visual Explanation of Subchains -- an application of weak blocks to secure zero-confirmation transactions and massively scale Bitcoin
New Directions in Bitcoin Development
Announcing Bitcoin Unlimited.
"It's because most of them are NOT Bitcoin experts--and I hope the community is finally starting to recognize that" -- Peter R on specialists vs. generalists and the aptitudes of Blockstream Core developers
It is time to usher in a new phase of Bitcoin development - based not on crypto & hashing & networking (that stuff's already done), but based on clever refactorings of datastructures in pursuit of massive and perhaps unlimited new forms of scaling
Peter__R on RBF
Peter__R on RBF: (1) Easier for scammers on Local Bitcoins (2) Merchants will be scammed, reluctant to accept Bitcoin (3) Extra work for payment processors (4) Could be the proverbial straw that broke Core's back, pushing people into XT, btcd, Unlimited and other clients that don't support RBF
Peter__R on Mt. Gox
Peter R’s Theory on the Collapse of Mt. Gox
Censorship and Attacks by Core / Blockstream / Theymos / Reddit Admins against Peter__R
Peter__R's infographic showing the BIP 101 growth trajectory gets deleted from /bitcoin for "trolling"
"Scaling Bitcoin" rejected Peter R's proposal
After censoring Mike and Gavin, BlockStream makes its first move to silence Peter R on bitcoin-dev like they did on /bitcoin
Looks like the censors in /bitcoin are at it again: Peter_R post taken down within minutes
I've been banned for vote brigading for the animated GIF that visualized the possible future deprecation of Bitcoin Core.
An example of moderator subjectivity in the interpretation of the rules at /bitcoin: animated pie chart visualizing the deprecation of Bitcoin Core
"My response to Pieter Wuille on the Dev-List has once again been censored, perhaps because I spoke favourably of Bitcoin Unlimited and pointed out misunderstandings by Maxwell and it is for those who are interested" -- Peter R
To those who are interested in judging whether Peter R's paper merits inclusion in the blockchain scaling conference, here it is:
The real reason Peter_R talk was refused (from his previous presentation) (xpost from /btc)
[CENSORED] The Morning After the Moderation Mistake: Thoughts on Consensus and the Longest Chain
Core / Blockstream cheerleader eragmus gloating over Peter__R's account getting suspended from Reddit (ie, from all subreddits) - by some Reddit admin(s)
[PSA] Uber Troll Extraordinaire, Peter__R, has been permanently suspended by Reddit
submitted by ydtm to btc [link] [comments]

Calculating the scale of a post singularity economy

We’ve all heard Musk’s explanation of base reality and can see how that’s closely tied to the idea of the technological singularity. I’m interested in speculating on how currency and financial systems can be perceived using a similar thought experiment.
First postulate: An advanced AI will be disincentivized to amass large sums of fiat currency.
How do you create an efficient AI economy where you don’t have legal standing within the human organization that controls your wealth? An AI needs a system that’s both legally and technically secure.
Second Postulate: Some AIs will be motivated to amass wealth in a form that they can control.
Leaving nationally or centrally controlled fiat currencies for a system that uses network based protocols is already happening, and has been happening for almost a decade. Crypto-currencies are like fiat in the same way that email is like an internationally coordinated postal system. While international postal networks continue to, and probably always will, operate in a niche capacity, the vast bulk of human correspondence has moved to email, as well as centralized non-protocol based systems of information distribution. It seems reasonable to assume that AIs will use crypto-currencies the way automation software uses email.
Third postulate: The number of CryptoCurrency-enabled independent AIs will grow exponentially over time.
While pondering this I considered the price performance of Bitcoin against Metcalfe's law: "The value of a telecommunications network is proportional to the square of the number of connected users of the system." It seems to follow that there is no logical upper limit or carrying capacity to the networks and systems that support these hypothetical AIs. So, as computation increases so does the participation in their financial networks. This implies that there is no theoretical limit to the size (and value) of the network. This is very similar to Musk's thought experiment about simulation and base reality: computation and networks have no theoretical upper limit when you zoom out to a time span of hundreds or thousands of years.
Fourth postulate: The value of Crypto-Currency base networks will grow exponentially over time.
I decided to observe year over year price performance of Bitcoin as an analogous reference for our theoretical AI economy.
Before we look at those details:
Investment advice from Disco Stu:
Past performance linearly projected against the 2017 and 90 day periodic trends:
A 30-day price projection shows that we are bouncing between the 60-day trend and the 2017 trend. If we project the price starting from the dates chosen for each trend, you can see that we are generally performing above past price performance, with greater swings. If today's prices behaved like past prices (that is, if the relationship were linear), we would project greater price stability and a significant drop in the daily periodic rate. This is evidence that Bitcoin's price performance scales logarithmically or quadratically rather than linearly.
Price projection based on currently observed daily periodic rates of interest:
| Label | 30-day Performance | 60-day Performance | 90-day Performance | 2017 - Present Performance | 2016 - Present Performance | 2015 Performance | 2014 - Present Performance | 2013 - Present Performance |
|---|---|---|---|---|---|---|---|---|
| From Date | 5/15/2017 | 4/15/2017 | 3/16/2017 | 1/1/2017 | 1/1/2016 | 1/1/2015 | 1/1/2014 | 1/1/2013 |
| Starting Price USD | $1,723.13 | $1,184.88 | $1,180.95 | $997.73 | $432.00 | $312.00 | $804.00 | $12.50 |
| Doubling in months | 1.69 | 1.33 | 1.99 | 2.69 | 3.77 | 6.57 | 22.73 | 3.81 |
| Doubling Period in Days | 51.41 | 40.55 | 60.44 | 81.78 | 114.57 | 199.98 | 691.36 | 115.81 |
| Days in period | 30.00 | 60.00 | 90.00 | 165.00 | 531.00 | 896.00 | 1261.00 | 1626.00 |
| Daily Periodic Rate | 1.40% | 1.78% | 1.19% | 0.88% | 0.63% | 0.36% | 0.10% | 0.62% |
| Period Percent Growth | 42.01% | 106.52% | 107.21% | 145.26% | 466.44% | 684.31% | 204.36% | 19476.33% |
| Annual Rate of Investment | | | | | 229.4% | 131.4% | 38.0% | 226.9% |
| Over USD 3,000 on | 2017-06-30 | 2017-06-27 | 2017-07-03 | 2017-07-09 | 2017-07-18 | 2017-08-11 | 2017-12-28 | 2017-07-18 |
| Over USD 5,000 on | 2017-08-06 | 2017-07-26 | 2017-08-15 | 2017-09-05 | 2017-10-08 | 2017-12-31 | 2019-05-03 | 2017-10-09 |
| Over USD 10,000 on | 2017-09-25 | 2017-09-03 | 2017-10-12 | 2017-11-23 | 2018-01-26 | 2018-07-12 | 2021-02-27 | 2018-01-29 |
| Over USD 50,000 on | 2018-01-18 | 2017-12-04 | 2018-02-25 | 2018-05-26 | 2018-10-10 | 2019-10-03 | 2025-05-23 | 2018-10-15 |
| Over USD 100,000 on | 2018-03-09 | 2018-01-12 | 2018-04-25 | 2018-08-13 | 2019-01-29 | 2020-04-13 | 2027-03-20 | 2019-02-04 |
| Over USD 1,000,000 on | 2018-08-22 | 2018-05-23 | 2018-11-05 | 2019-05-02 | 2020-01-31 | 2022-01-14 | 2033-04-09 | 2020-02-11 |
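Assuming constant compound daily growth, the doubling-period and threshold-date figures above can be reproduced with a short sketch. (The published table was presumably computed from unrounded rates, so the outputs here differ slightly; the inputs below are the 90-day column's rounded values.)

```python
import math
from datetime import date, timedelta

def doubling_days(daily_rate):
    """Days for the price to double at a constant daily periodic rate."""
    return math.log(2) / math.log(1 + daily_rate)

def first_date_over(start_date, start_price, daily_rate, target):
    """First date on which the projected price exceeds `target`."""
    days_needed = math.log(target / start_price) / math.log(1 + daily_rate)
    return start_date + timedelta(days=math.ceil(days_needed))

# 90-day column: 1.19%/day from $1,180.95 on 2017-03-16
print(doubling_days(0.0119))  # ~58.6 days (the table's 60.44 came from an unrounded rate)
print(first_date_over(date(2017, 3, 16), 1180.95, 0.0119, 10000))
```

Small differences in the daily rate compound into weeks of difference in the threshold dates, which is why the rounded 1.19% input lands on an earlier "over $10,000" date than the table's.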
Data Source Summary
Data older than a year is a close approximation based on observations of bitstamp from charts. The percent error caused by that fuzzy way of collecting data is assumed to be minimal.
All other source data is queried directly from servers that support
Bitcoin price performance since 2013 and 2016 is almost identical. Recent price performance appears to be much too high. So, let's take as our most bearish case a daily periodic rate of 0.1% for the growth of the network's value. That still shows that the value of the AI financial network would exceed 21 trillion USD by 2033 if it were established in 2009. The less conservative case would have us at 21 trillion USD next year, and the most liberal would have us there by the end of 2017.
When you consider the scale of the abundance this implies, it's hard to imagine that there isn't an environmental factor that might impose a limit on Metcalfe's law. That leads to further implications for the future of economic systems.
submitted by jarederaj to Futurology [link] [comments]

Bitcoin compared with Metcalfe's and Zipf's law

Besides the sun coming up every day, there are few predictable patterns in life. There are systems that follow precise power laws that have to do with the nature of the phenomenon. Bitcoin is such a phenomenon. That is what is not understood. Regulations, pumps and dumps, and news are almost a non-factor. They can momentarily jump the price up and down, but BTC then goes back to its trend line or oscillates around it. On average we have been 14% away from this trend line in both directions, with occasional 70 or 80 percent discrepancies (rare events). But even factors of 2 are meaningless when you talk about exponential growth.
The exponential growth is driven by one factor only, not millions: the rate of adoption. Period. In fact there is a strong correlation (R² = 0.82) between the number of users and price. All these things are not understood by too many people, unfortunately. Also, the price doesn't grow linearly with the number of users, but instead with the number of users raised to the power of 1.45. That is nice, because for the price to increase 1000 times you need only 140 times the number of users of today. We have about 2 million BTC users.
So 300 million people using BTC is very reasonable. That would bring the price up to 1 million dollars. These are not numbers I made up but I have spent hours studying the data and I have extracted the information from 3.5 years of BTC history. There is no reason why this predictable growth, that has been very smooth and not affected by news or other irrelevant factors, would not continue until saturation that is very far from now.
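The claim above is easy to sanity-check by inverting the fitted power law, price ∝ users^1.45. (A sketch only: the exact inversion gives roughly a 117x user multiple for a 1000x price, the same order of magnitude as the ~140x cited in the post.)

```python
def users_multiple_for(price_multiple, exponent=1.45):
    # price ~ users**exponent  =>  user multiple = price_multiple**(1/exponent)
    return price_multiple ** (1 / exponent)

multiple = users_multiple_for(1000)
print(round(multiple))                      # 117
print(round(2_000_000 * multiple / 1e6))    # ~234 million users for a 1000x price
```

Either way, the required user base lands in the low hundreds of millions, which is the scale the post argues is reasonable.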
Look up Geoffrey West, a physicist that has worked on growth patterns of organisms, cities and corporations to understand what I'm talking about:
Here is a comparison between Metcalfe's, Zipf's and Bitcoin's law.
And a graph of the relationship between transaction per day (excluding popular addresses) and price.
Here are the steps used to produce the first chart:
1) Used the empirical data of unique addresses as a function of time.
2) Fitted a logistic model to the data in 1) with only one free variable (number of final users)
3) Fitted the data points with a linear regression model in a log-log graph, with price on the y axis and users on the x axis. Derived a power law with a power of 1.45 by measuring the slope.
4) Used this power law and the logistic model to predict the price.
5) Calculated how well the model fits the empirical trend of price vs time and obtained a highly statistical significant value.
6) Plotted as a comparison what one would obtain using Metcalfe's or Zipf's law. They don't fit very well at all. Bitcoin's law is in between these two (power of 1.45).
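Steps 2–4 above can be sketched on synthetic data (a stand-in for the real address and price series, assuming a logistic adoption curve and the power law described; NumPy's `polyfit` on the log-log data recovers the exponent as the slope):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the empirical series: users grow logistically,
# price follows users**1.45 with multiplicative noise.
t = np.linspace(0, 10, 200)
users = 4e6 / (1 + np.exp(-(t - 5)))             # logistic adoption curve
price = users ** 1.45 * np.exp(rng.normal(0, 0.1, t.size))

# Step 3: linear regression in log-log space; the slope is the power.
slope, intercept = np.polyfit(np.log(users), np.log(price), 1)
print(round(slope, 2))   # ~1.45

# Step 4: use the fitted law plus the logistic model to predict price.
predicted = np.exp(intercept) * users ** slope
```

With real data the recovered slope is of course only as good as the fit (hence the R² check), but the mechanics are exactly this log-log regression.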
I also used Granger causality to show that there is causation, not just correlation, between users and price (there is a weak feedback loop in the other direction too, but the main direction is more users -> higher price).
submitted by gsantostasi to Bitcoin [link] [comments]

Is anyone else freaked out by this whole blocksize debate? Does anyone else find themself often agreeing with *both* sides - depending on whichever argument you happen to be reading at the moment? And do we need some better algorithms and data structures?

Why do both sides of the debate seem “right” to me?
I know, I know, a healthy debate is healthy and all - and maybe I'm just not used to the tumult and jostling which would be inevitable in a real live open major debate about something as vital as Bitcoin.
And I really do agree with the starry-eyed idealists who say Bitcoin is vital. Imperfect as it may be, it certainly does seem to represent the first real chance we've had in the past few hundred years to try to steer our civilization and our planet away from the dead-ends and disasters which our government-issued debt-based currencies keep dragging us into.
But this particular debate, about the blocksize, doesn't seem to be getting resolved at all.
Pretty much every time I read one of the long-form major arguments contributed by Bitcoin "thinkers" who I've come to respect over the past few years, this weird thing happens: I usually end up finding myself nodding my head and agreeing with whatever particular piece I'm reading!
But that should be impossible - because a lot of these people vehemently disagree!
So how can both sides sound so convincing to me, simply depending on whichever piece I currently happen to be reading?
Does anyone else feel this way? Or am I just a gullible idiot?
Just Do It?
When you first look at it or hear about it, increasing the size seems almost like a no-brainer: The "big-block" supporters say just increase the blocksize to 20 MB or 8 MB, or do some kind of scheduled or calculated regular increment which tries to take into account the capabilities of the infrastructure and the needs of the users. We do have the bandwidth and the memory to at least increase the blocksize now, they say - and we're probably gonna continue to have more bandwidth and memory in order to be able to keep increasing the blocksize for another couple decades - pretty much like everything else computer-based we've seen over the years (some of this stuff is called by names such as "Moore's Law").
On the other hand, whenever the "small-block" supporters warn about the utter catastrophe that a failed hard-fork would mean, I get totally freaked by their possible doomsday scenarios, which seem totally plausible and terrifying - so I end up feeling that the only way I'd want to go with a hard-fork would be if there was some pre-agreed "triggering" mechanism where the fork itself would only actually "switch on" and take effect provided that some "supermajority" of the network (of who? the miners? the full nodes?) had signaled (presumably via some kind of totally reliable p2p trustless software-based voting system?) that they do indeed "pre-agree" to actually adopt the pre-scheduled fork (and thereby avoid any possibility whatsoever of the precious blockchain somehow tragically splitting into two and pretty much killing this cryptocurrency off in its infancy).
So in this "conservative" scenario, I'm talking about wanting at least 95% pre-adoption agreement - not the mere 75% which I recall some proposals call for, which seems like it could easily lead to a 75/25 blockchain split.
But this time, with this long drawn-out blocksize debate, the core devs, and several other important voices who have become prominent opinion shapers over the past few years, can't seem to come to any real agreement on this.
Weird split among the devs
As far as I can see, there's this weird split: Gavin and Mike seem to be the only people among the devs who really want a major blocksize increase - and all the other devs seem to be vehemently against them.
But then on the other hand, the users seem to be overwhelmingly in favor of a major increase.
And there are meta-questions about governance, about why this didn't come out as a BIP, and about what the availability of Bitcoin XT means.
And today or yesterday there was this really cool big-blockian exponential graph based on doubling the blocksize every two years for twenty years, reminding us of the pure mathematical fact that 2^10 is indeed about 1000 - but not really addressing any of the game-theoretic points raised by the small-blockians. So a lot of the users seem to like it, but when so few devs say anything positive about it, I worry: is this just yet more exponential chart porn?
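For what it's worth, the chart's arithmetic itself is trivial to verify: twenty years of doubling every two years is ten doublings of a 1 MB limit (a sketch of the schedule only, not a claim about feasibility):

```python
doublings = 20 // 2              # doubling every two years for twenty years
final_mb = 1 * 2 ** doublings    # starting from today's 1 MB limit
print(final_mb)                  # 1024 MB: 2**10 is indeed "about 1000"
```

The math is not in dispute; the game-theoretic and infrastructure questions are.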
On the one hand, Gavin's and Mike's blocksize increase proposal initially seemed like a no-brainer to me.
And on the other hand, all the other devs seem to be against them. Which is weird - not what I'd initially expected at all (but maybe I'm just a fool who's seduced by exponential chart porn?).
Look, I don't mean to be rude to any of the core devs, and I don't want to come off like someone wearing a tinfoil hat - but it has to cross people's minds that the powers that be (the Fed and the other central banks and the governments that use their debt-issued money to run this world into a ditch) could very well be much more scared shitless than they're letting on. If we assume that the powers that be are using their usual playbook and tactics, then it could be worth looking at the book "Confessions of an Economic Hitman" by John Perkins, to get an idea of how they might try to attack Bitcoin. So, what I'm saying is, they do have a track record of sending in "experts" to try to derail projects and keep everyone enslaved to the Creature from Jekyll Island. I'm just saying. So, without getting ad hominem - let's just make sure that our ideas can really stand scrutiny on their own - as Nick Szabo says, we need to make sure there is "more computer science, less noise" in this debate.
When Gavin Andresen first came out with the 20 MB thing - I sat back and tried to imagine if I could download 20 MB in 10 minutes (which seems to be one of the basic mathematical and technological constraints here - right?)
I figured, "Yeah, I could download that" - even with my crappy internet connection.
And I guess the telecoms might be nice enough to continue to double our bandwidth every two years for the next couple decades – if we ask them politely?
On the other hand - I think we should be careful about entrusting the financial freedom of the world into the greedy hands of the telecoms companies - given all their shady shenanigans over the past few years in many countries. After decades of the MPAA and the FBI trying to chip away at BitTorrent, lately PirateBay has been hard to access. I would say it's quite likely that certain persons at institutions like JPMorgan and Goldman Sachs and the Fed might be very, very motivated to see Bitcoin fail - so we shouldn't be too sure about scaling plans which depend on the willingness of companies like Verizon and AT&T to double our bandwidth every two years.
Maybe the real important hardware buildout challenge for a company like 21 (and its allies such as Qualcomm) to take on now would not be "a miner in every toaster" but rather "Google Fiber Download and Upload Speeds in every Country, including China".
I think I've read all the major stuff on the blocksize debate from Gavin Andresen, Mike Hearn, Greg Maxwell, Peter Todd, Adam Back, and Jeff Garzik and several other major contributors - and, oddly enough, all their arguments seem reasonable - heck even Luke-Jr seems reasonable to me on the blocksize debate, and I always thought he was a whackjob overly influenced by superstition and numerology - and now today I'm reading the article by Bram Cohen - the inventor of BitTorrent - and I find myself agreeing with him too!
I say to myself: What's going on with me? How can I possibly agree with all of these guys, if they all have such vehemently opposing viewpoints?
I mean, think back to the glory days of a couple of years ago, when all we were hearing was how this amazing unprecedented grassroots innovation called Bitcoin was going to benefit everyone from all walks of life, all around the world:
...basically the entire human race transacting everything into the blockchain.
(Although let me say that I think that people's focus on ideas like driverless cabs creating realtime fare markets based on supply and demand seems to be setting our sights a bit low as far as Bitcoin's abilities to correct the financial world's capital-misallocation problems which seem to have been made possible by infinite debt-based fiat. I would have hoped that a Bitcoin-based economy would solve much more noble, much more urgent capital-allocation problems than driverless taxicabs creating fare markets or refrigerators ordering milk on the internet of things. I was thinking more along the lines that Bitcoin would finally strangle dead-end debt-based deadly-toxic energy industries like fossil fuels and let profitable clean energy industries like Thorium LFTRs take over - but that's another topic. :=)
Paradoxes in the blocksize debate
Let me summarize the major paradoxes I see here:
(1) Regarding the people (the majority of the core devs) who are against a blocksize increase: Well, the small-blocks arguments do seem kinda weird, and certainly not very "populist", in the sense that: When on earth have end-users ever heard of a computer technology whose capacity didn't grow pretty much exponentially year-on-year? All the cool new technology we've had - from hard drives to RAM to bandwidth - started out pathetically tiny and grew to unimaginably huge over the past few decades - and all our software has in turn gotten massively powerful and big and complex (sometimes bloated) to take advantage of the enormous new capacity available.
But now suddenly, for the first time in the history of technology, we seem to have a majority of the devs, on a major p2p project - saying: "Let's not scale the system up. It could be dangerous. It might break the whole system (if the hard-fork fails)."
I don't know, maybe I'm missing something here, maybe someone else could enlighten me, but I don't think I've ever seen this sort of thing happen in the last few decades of the history of technology - devs arguing against scaling up p2p technology to take advantage of expected growth in infrastructure capacity.
(2) But... on the other hand... the dire warnings of the small-blockians about what could happen if a hard-fork were to fail - wow, they do seem really dire! And these guys are pretty much all heavyweight, experienced programmers and/or game theorists and/or p2p open-source project managers.
I must say, that nearly all of the long-form arguments I've read - as well as many, many of the shorter comments I've read from many users in the threads, whose names I at least have come to more-or-less recognize over the past few months and years on reddit and bitcointalk - have been amazingly impressive in their ability to analyze all aspects of the lifecycle and management of open-source software projects, bringing up lots of serious points which I could never have come up with, and which seem to come from long experience with programming and project management - as well as dealing with economics and human nature (eg, greed - the game-theory stuff).
So a lot of really smart and experienced people with major expertise in various areas ranging from programming to management to game theory to politics to economics have been making some serious, mature, compelling arguments.
But, as I've been saying, the only problem to me is: in many of these cases, these arguments are vehemently in opposition to each other! So I find myself agreeing with pretty much all of them, one by one - which means the end result is just a giant contradiction.
I mean, today we have Bram Cohen, the inventor of BitTorrent, arguing (quite cogently and convincingly to me), that it would be dangerous to increase the blocksize. And this seems to be a guy who would know a few things about scaling out a massive global p2p network - since the protocol which he invented, BitTorrent, is now apparently responsible for like a third of the traffic on the internet (and this despite the long-term concerted efforts of major evil players such as the MPAA and the FBI to shut the whole thing down).
Was the BitTorrent analogy too "glib"?
By the way - I would like to go on a slight tangent here and say that one of the main reasons why I felt so "comfortable" jumping on the Bitcoin train back a few years ago, when I first heard about it and got into it, was the whole rough analogy I saw with BitTorrent.
I remembered the perhaps paradoxical fact that when a torrent is more popular (eg, a major movie release that just came out last week), then it actually becomes faster to download. More people want it, so more people have a few pieces of it, so more people are able to get it from each other. A kind of self-correcting economic feedback loop, where more demand directly leads to more supply.
(BitTorrent manages to pull this off by essentially adding a certain structure to the file being shared, so that it's not simply like an append-only list of 1 MB blocks, but rather more like a random-access or indexed array of 1 MB chunks. Say you're downloading a film which is 700 MB. As soon as your "client" program has downloaded a single 1-MB chunk - say chunk #99 - your "client" program instantly turns into a "server" program as well - offering that chunk #99 to other clients. From my simplistic understanding, I believe the Bitcoin protocol does something similar, to provide a p2p architecture. Hence my - perhaps naïve - assumption that Bitcoin already had the right algorithms / architecture / data structure to scale.)
The efficiency of the BitTorrent network seemed to jive with that "network law" (Metcalfe's Law?) about fax machines. This law states that the more fax machines there are, the more valuable the network of fax machines becomes. Or the value of the network grows on the order of the square of the number of nodes.
This is in contrast with other technology like cars, where the more you have, the worse things get. The more cars there are, the more traffic jams you have, so things start going downhill. I guess this is because highway space is limited - after all, we can't pave over the entire countryside, and we never did get those flying cars we were promised, as David Graeber laments in a recent essay in The Baffler magazine :-)
And regarding the "stress test" supposedly happening right now in the middle of this ongoing blocksize debate, I don't know what worries me more: the fact that it apparently is taking only $5,000 to do a simple kind of DoS on the blockchain - or the fact that there are a few rumors swirling around saying that the unknown company doing the stress test shares the same physical mailing address with a "scam" company?
Or maybe we should just be worried that so much of this debate is happening on a handful of forums which are controlled by some guy named theymos who's already engaged in some pretty "contentious" or "controversial" behavior like blowing a million dollars on writing forum software (I guess he never heard that software is open-source)?
So I worry that the great promise of "decentralization" might be more fragile than we originally thought.
Anyways, back to Metcalfe's Law: with virtual stuff, like torrents and fax machines, the more the merrier. The more people downloading a given movie, the faster it arrives - and the more people own fax machines, the more valuable the overall fax network.
So I kindof (naïvely?) assumed that Bitcoin, being "virtual" and p2p, would somehow scale up the same magical way BitTorrrent did. I just figured that more people using it would somehow automatically make it stronger and faster.
But now a lot of devs have started talking in terms of the old "scarcity" paradigm, talking about blockspace being a "scarce resource" and talking about "fee markets" - which seems kinda scary, and antithetical to much of the earlier rhetoric we heard about Bitcoin (the stuff about supporting our favorite creators with micropayments, and the stuff about Africans using SMS to send around payments).
Look, when some asshole is in line in front of you at the cash register and he's holding up the line so they can run his credit card to buy a bag of Cheeto's, we tend to get pissed off at the guy - clogging up our expensive global electronic payment infrastructure to make a two-dollar purchase. And that's on a fairly efficient centralized system - and presumably after a year or so, VISA and the guy's bank can delete or compress the transaction in their SQL databases.
Now, correct me if I'm wrong, but if some guy buys a coffee on the blockchain, or if somebody pays an online artist $1.99 for their work - then that transaction, a few hundred bytes or so, has to live on the blockchain forever?
Or is there some "pruning" thing that gets rid of it after a while?
And this could lead to another question: Viewed from the perspective of double-entry bookkeeping, is the blockchain "world-wide ledger" more like the "balance sheet" part of accounting, i.e. a snapshot showing current assets and liabilities? Or is it more like the "cash flow" part of accounting, i.e. a journal showing historical revenues and expenses?
When I think of thousands of machines around the globe having to lug around multiple identical copies of a multi-gigabyte file containing some asshole's coffee purchase forever and ever... I feel like I'm ideologically drifting in one direction (where I'd end up also being against really cool stuff like online micropayments and Africans banking via SMS)... so I don't want to go there.
But on the other hand, when really experienced and battle-tested veterans with major experience in the world of open-souce programming and project management (the "small-blockians") warn of the catastrophic consequences of a possible failed hard-fork, I get freaked out and I wonder if Bitcoin really was destined to be a settlement layer for big transactions.
Could the original programmer(s) possibly weigh in?
And I don't mean to appeal to authority - but heck, where the hell is Satoshi Nakamoto in all this? I do understand that he/she/they would want to maintain absolute anonymity - but on the other hand, I assume SN wants Bitcoin to succeed (both for the future of humanity - or at least for all the bitcoins SN allegedly holds :-) - and I understand there is a way that SN can cryptographically sign a message - and I understand that as the original developer of Bitcoin, SN had some very specific opinions about the blocksize... So I'm kinda wondering if Satoshi could weigh in from time to time. Just to help out a bit. I'm not saying "Show us a sign" like a deity or something - but damn it sure would be fascinating and possibly very helpful if Satoshi gave us his/her/their 2 satoshis worth at this really confusing juncture.
Are we using our capacity wisely?
I'm not a programming or game-theory whiz, I'm just a casual user who has tried to keep up with technology over the years.
It just seems weird to me that here we have this massive supercomputer (500 times more powerful than all the supercomputers in the world combined) doing fairly straightforward "embarrassingly parallel" number-crunching operations to secure a p2p world-wide ledger called the blockchain to keep track of a measly 2.1 quadrillion tokens spread out among a few billion addresses - and a couple of years ago you had people like Rick Falkvinge saying the blockchain would someday be supporting multi-million-dollar letters of credit for international trade and you had people like Andreas Antonopoulos saying the blockchain would someday allow billions of "unbanked" people to send remittances around the village or around the world dirt-cheap - and now suddenly in June 2015 we're talking about blockspace as a "scarce resource" and talking about "fee markets" and partially centralized, corporate-sponsored "Level 2" vaporware like Lightning Network and some mysterious company is "stress testing" or "DoS-ing" the system by throwing away a measly $5,000 and suddenly it sounds like the whole system could eventually head right back into PayPal and Western Union territory again, in terms of expensive fees.
When I got into Bitcoin, I really was heavily influenced by vague analogies with BitTorrent: I figured everyone would just have tiny little like utorrent-type program running on their machine (ie, Bitcoin-QT or Armory or Mycelium etc.).
I figured that just like anyone can host their own blog or webserver, anyone would be able to host their own bank.
Yeah, Google and Mozilla and Twitter and Facebook and WhatsApp did come along and build stuff on top of TCP/IP, so I did expect a bunch of companies to build layers on top of the Bitcoin protocol as well. But I still figured the basic unit of bitcoin client software powering the overall system would be small and personal and affordable and p2p - like a bittorrent client - or at the most, like a cheap server hosting a blog or email server.
And I figured there would be a way at the software level, at the architecture level, at the algorithmic level, at the data structure level - to let the thing scale - if not infinitely, at least fairly massively and gracefully - the same way the BitTorrent network has.
Of course, I do also understand that with BitTorrent, you're sharing a read-only object (eg, a movie) - whereas with Bitcoin, you're achieving distributed trustless consensus and appending it to a write-only (or append-only) database.
So I do understand that the problem which BitTorrent solves is much simpler than the problem which Bitcoin sets out to solve.
But still, it seems that there's got to be a way to make this thing scale. It's p2p and it's got 500 times more computing power than all the supercomputers in the world combined - and so many brilliant and motivated and inspired people want this thing to succeed! And Bitcoin could be our civilization's last chance to steer away from the oncoming debt-based ditch of disaster we seem to be driving into!
It just seems that Bitcoin has got to be able to scale somehow - and all these smart people working together should be able to come up with a solution which pretty much everyone can agree - in advance - will work.
Right? Right?
A (probably irrelevant) tangent on algorithms and architecture and data structures
I'll finally weigh in with my personal perspective - although I might be biased due to my background (which is more on the theoretical side of computer science).
My own modest - or perhaps radical - suggestion would be to ask whether we're really looking at all the best possible algorithms and architectures and data structures out there.
From this perspective, I sometimes worry that the overwhelming majority of the great minds working on the programming and game-theory stuff might come from a rather specific, shall we say "von Neumann" or "procedural" or "imperative" school of programming (ie, C and Python and Java programmers).
It seems strange to me that such a cutting-edge and important computer project would have so little participation from the great minds at the other end of the spectrum of programming paradigms - namely, the "functional" and "declarative" and "algebraic" (and co-algebraic!) worlds.
For example, I was struck in particular by statements I've seen here and there (which seemed rather hubristic or lackadaisical to me - for something as important as Bitcoin), that the specification of Bitcoin and the blockchain doesn't really exist in any form other than the reference implementation(s) (in procedural languages such as C or Python?).
Curry-Howard anyone?
I mean, many computer scientists are aware of the Curry-Howard isomorphism, which basically says that the relationship between a theorem and its proof is equivalent to the relationship between a specification and its implementation. In other words, there is a long tradition in mathematics (and in computer programming) of first stating precisely what is to be established (the theorem, or the specification), and only then constructing the thing that establishes it (the proof, or the implementation).
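As a tiny concrete instance of the isomorphism, here is the same object read both ways in Lean (an illustrative sketch, not anything from the Bitcoin codebase): the type is the theorem (the specification), and the term inhabiting it is the proof (the implementation).

```lean
-- Curry-Howard in miniature: a proposition is a type,
-- and a proof of it is a program of that type.

-- Read as logic: "if P implies Q, and P holds, then Q holds" (modus ponens).
theorem modus_ponens (P Q : Prop) (f : P → Q) (p : P) : Q := f p

-- The very same term, read as a program: apply a function to an argument.
def applyFn {α β : Type} (f : α → β) (a : α) : β := f a
```

Checking the proof and type-checking the program are literally the same activity - which is why the theoretical community insists on seeing a specification before an implementation.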
And it's not exactly "turtles all the way down" either: a specification is generally simple and compact enough that a good programmer can usually simply visually inspect it to determine if it is indeed "correct" - something which is very difficult, if not impossible, to do with a program written in a procedural, implementation-oriented language such as C or Python or Java.
So I worry that we've got this tradition, from the open-source github C/Java programming tradition, of never actually writing our "specification", and only writing the "implementation". In mission-critical military-grade programming projects (which often use languages like Ada or Maude) this is simply not allowed. It would seem that a project as mission-critical as Bitcoin - which could literally be crucial for humanity's continued survival - should also use this kind of military-grade software development approach.
And I'm not saying rewrite the implementations in these kind of theoretical languages. But it might be helpful if the C/Python/Java programmers in the Bitcoin imperative programming world could build some bridges to the Maude/Haskell/ML programmers of the functional and algebraic programming worlds to see if any kind of useful cross-pollination might take place - between specifications and implementations.
For example, the JavaFAN formal analyzer for multi-threaded Java programs (developed using tools based on the Maude language) was applied to the Remote Agent AI program aboard NASA's Deep Space 1 spacecraft, written in Java - and it took only a few minutes using formal mathematical reasoning to detect a potential deadlock which would have occurred years later during the space mission when the damn spacecraft was already way out around Pluto.
And "the Maude-NRL (Naval Research Laboratory) Protocol Analyzer (Maude-NPA) is a tool used to provide security proofs of cryptographic protocols and to search for protocol flaws and cryptosystem attacks."
These are open-source formal reasoning tools developed by DARPA and used by NASA and the US Navy to ensure that program implementations satisfy their specifications. It would be great if some of the people involved in these kinds of projects could contribute to help ensure the security and scalability of Bitcoin.
But there is a wide abyss between the kinds of programmers who use languages like Maude and the kinds of programmers who use languages like C/Python/Java - and it can be really hard to get the two worlds to meet. There is a bit of rapprochement between these language communities in languages which might be considered as being somewhere in the middle, such as Haskell and ML. I just worry that Bitcoin might be turning into being an exclusively C/Python/Java project (with the algorithms and practitioners traditionally of that community), when it could be more advantageous if it also had some people from the functional and algebraic-specification and program-verification community involved as well. The thing is, though: the theoretical practitioners are big on "semantics" - I've heard them say stuff like "Yes but a C / C++ program has no easily identifiable semantics". So to get them involved, you really have to first be able to talk about what your program does (specification) - before proceeding to describe how it does it (implementation). And writing high-level specifications is typically very hard using the syntax and semantics of languages like C and Java and Python - whereas specs are fairly easy to write in Maude - and not only that, they're executable, and you state and verify properties about them - which provides for the kind of debate Nick Szabo was advocating ("more computer science, less noise").
Imagine if we had an executable algebraic specification of Bitcoin in Maude, where we could formally reason about and verify certain crucial game-theoretical properties - rather than merely hand-waving and arguing and deploying and praying.
And so in the theoretical programming community you've got major research on various logics such as Girard's Linear Logic (which is resource-conscious) and Bruni and Montanari's Tile Logic (which enables "pasting" bigger systems together from smaller ones in space and time), and executable algebraic specification languages such as Meseguer's Maude (which would be perfect for game theory modeling, with its functional modules for specifying the deterministic parts of systems and its system modules for specifying non-deterministic parts of systems, and its parameterized skeletons for sketching out the typical architectures of mobile systems, and its formal reasoning and verification tools and libraries which have been specifically applied to testing and breaking - and fixing - cryptographic protocols).
And somewhat closer to the practical hands-on world, you've got stuff like Google's MapReduce and lots of Big Data database languages developed by Google as well. And yet here we are with a mempool growing dangerously big for RAM on a single machine, and a 20-GB append-only list as our database - and not much debate on practical results from Google's Big Data databases.
(And by the way: maybe I'm totally ignorant for asking this, but I'll ask anyways: why the hell does the mempool have to stay in RAM? Couldn't it work just as well if it were stored temporarily on the hard drive?)
And you've got CalvinDB out of Yale which apparently provides an ACID layer on top of a massively distributed database.
Look, I'm just an armchair follower cheering on these projects. I can barely manage to write a query in SQL, or read through a C or Python or Java program. But I would argue two points here: (1) these languages may be too low-level and "non-formal" for writing and modeling and formally reasoning about and proving properties of mission-critical specifications - and (2) there seem to be some Big Data tools already deployed by institutions such as Google and Yale which support global petabyte-size databases on commodity boxes with nice properties such as near-real-time and ACID - and I sometimes worry that the "core devs" might be failing to review the literature (and reach out to fellow programmers) out there to see if there might be some formal program-verification and practical Big Data tools out there which could be applied to coming up with rock-solid, 100% consensus proposals to handle an issue such as blocksize scaling, which seems to have become much more intractable than many people might have expected.
I mean, the protocol solved the hard stuff: the elliptical-curve stuff and the Byzantine General stuff. How the heck can we be falling down on the comparatively "easier" stuff - like scaling the blocksize?
It just seems like defeatism to say "Well, the blockchain is already 20-30 GB and it's gonna be 20-30 TB ten years from now - and we need 10 Mbs bandwidth now and 10,000 Mbs bandwidth 20 years from now - assuming the evil Verizon and AT&T actually give us that - so let's just become a settlement platform and give up on buying coffee or banking the unbanked or doing micropayments, and let's push all that stuff into some corporate-controlled vaporware without even a whitepaper yet."
So you've got Peter Todd doing some possibly brilliant theorizing and extrapolating on the idea of "treechains" - there is a Let's Talk Bitcoin podcast from about a year ago where he sketches the rough outlines of this idea out in a very inspiring, high-level way - although the specifics have yet to be hammered out. And we've got Blockstream also doing some hopeful hand-waving about the Lightning Network.
Things like Peter Todd's treechains - which may be similar to the spark in some devs' eyes called Lightning Network - are examples of the kind of algorithm or architecture which might manage to harness the massive computing power of miners and nodes in such a way that certain kinds of massive and graceful scaling become possible.
It just seems like a kind of tiny dev community working on this stuff.
Being a C or Python or Java programmer should not be a pre-req to being able to help contribute to the specification (and formal reasoning and program verification) for Bitcoin and the blockchain.
XML and UML are crap modeling and specification languages, and C and Java and Python are even worse (as specification languages - although as implementation languages, they are of course fine).
But there are serious modeling and specification languages out there, and they could be very helpful at times like this - where what we're dealing with is questions of modeling and specification (ie, "needs and requirements").
One just doesn't often see the practical, hands-on world of open-source github implementation-level programmers and the academic, theoretical world of specification-level programmers meeting very often. I wish there were some way to get these two worlds to collaborate on Bitcoin.
Maybe a good first step to reach out to the theoretical people would be to provide a modular executable algebraic specification of the Bitcoin protocol in a recognized, military/NASA-grade specification language such as Maude - because that's something the theoretical community can actually wrap their heads around, whereas it's very hard to get them to pay attention to something written only as a C / Python / Java implementation (without an accompanying specification in a formal language).
They can't check whether the program does what it's supposed to do - if you don't provide a formal mathematical definition of what the program is supposed to do.
Specification : Implementation :: Theorem : Proof
You have to remember: the theoretical community is very aware of the Curry-Howard isomorphism. Just like it would be hard to get a mathematician's attention by merely showing them a proof without also telling them what theorem the proof is proving - by the same token, it's hard to get the attention of a theoretical computer scientist by merely showing them an implementation without showing them the specification that it implements.
Bitcoin is currently confronted with a mathematical or "computer science" problem: how to secure the network while getting high enough transactional throughput, while staying within the limited RAM, bandwidth and hard drive space limitations of current and future infrastructure.
The problem only becomes a political and economic problem if we give up on trying to solve it as a mathematical and "theoretical computer science" problem.
There should be a plethora of whitepapers out now proposing algorithmic solutions to these scaling issues. Remember, all we have to do is apply the Byzantine General consensus-reaching procedure to a worldwide database which shuffles 2.1 quadrillion tokens among a few billion addresses. The 21 company has emphatically pointed out that racing to compute a hash to add a block is an "embarrassingly parallel" problem - very easy to decompose among cheap, fault-prone, commodity boxes, and recompose into an overall solution - along the lines of Google's highly successful MapReduce.
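The "embarrassingly parallel" decomposition mentioned above is easy to make concrete: each worker scans a disjoint slice of the nonce space with no shared state, which is exactly what makes the "map" step of a MapReduce-style decomposition trivial. This is an illustrative toy (the header bytes, slice sizes, and the two-leading-zeros difficulty target are all made up), not real mining code:

```python
import hashlib

def search_range(header: bytes, start: int, end: int, difficulty_prefix: str):
    """Scan one slice of the nonce space for a winning double-SHA256 hash.
    Each slice is an independent unit of work with no shared state, so it
    can be handed to any worker (miner) in parallel, MapReduce-style."""
    for nonce in range(start, end):
        digest = hashlib.sha256(
            hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()
        ).hexdigest()
        if digest.startswith(difficulty_prefix):
            return nonce, digest
    return None

# The nonce space decomposes into disjoint slices; the "reduce" step is
# just collecting whichever slices found a hash below the target.
header = b"toy block header bytes"
slices = [(i * 50_000, (i + 1) * 50_000) for i in range(4)]
results = [search_range(header, lo, hi, "00") for (lo, hi) in slices]
winners = [r for r in results if r is not None]
```

Here the slices run sequentially for simplicity; swapping the list comprehension for a `concurrent.futures` pool parallelizes it without changing any other line, which is the whole point of the "embarrassingly parallel" label.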
I guess what I'm really saying is (and I don't mean to be rude here), is that C and Python and Java programmers might not be the best qualified people to develop and formally prove the correctness of (note I do not say: "test", I say "formally prove the correctness of") these kinds of algorithms.
I really believe in the importance of getting the algorithms and architectures right - look at Google Search itself, it uses some pretty brilliant algorithms and architectures (eg, MapReduce, Paxos) which enable it to achieve amazing performance - on pretty crappy commodity hardware. And look at BitTorrent, which is truly p2p, where more demand leads to more supply.
So, in this vein, I will close this lengthy rant with an oddly specific link - which may or may not be able to make some interesting contributions to finding suitable algorithms, architectures and data structures which might help Bitcoin scale massively. I have no idea if this link could be helpful - but given the near-total lack of people from the Haskell and ML and functional worlds in these Bitcoin specification debates, I thought I'd be remiss if I didn't throw this out - just in case there might be something here which could help us channel the massive computing power of the Bitcoin network in such a way as to enable us simply sidestep this kind of desperate debate where both sides seem right because the other side seems wrong.
The above paper is about "higher dimensional trees". It uses a bit of category theory (not a whole lot) and a bit of Haskell (again not a lot - just a simple data structure called a Rose tree, which has a wikipedia page) to develop a very expressive and efficient data structure which generalizes from lists to trees to higher dimensions.
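For readers without Haskell at hand, the rose tree itself is easy to sketch: each node carries a value and an arbitrary list of subtrees, so a list (or a blockchain) is just the degenerate case where every node has at most one child. This is a minimal illustrative sketch of the data structure, not a reconstruction of anything in the linked paper:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Rose:
    """A rose tree: a value plus an arbitrary number of subtrees.
    A linked list is the degenerate case with at most one child per node."""
    value: int
    children: List["Rose"] = field(default_factory=list)

def size(t: Rose) -> int:
    """Count all nodes in the tree."""
    return 1 + sum(size(c) for c in t.children)

def flatten(t: Rose) -> List[int]:
    """Pre-order traversal: project the tree back down to a list-like view."""
    out = [t.value]
    for c in t.children:
        out.extend(flatten(c))
    return out

chain = Rose(0, [Rose(1, [Rose(2)])])          # a "blockchain" as a unary tree
tree = Rose(0, [Rose(1), Rose(2, [Rose(3)])])  # side-branches become subtrees
```

The appeal for tree-chain-style proposals is visible even at this toy scale: the list-shaped structure and the branching structure share one type, so algorithms written against `Rose` generalize from chains to side-chains for free.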
I have no idea if this kind of data structure could be applicable to the current scaling mess we apparently are getting bogged down in - I don't have the game-theory skills to figure it out.
I just thought that since the blockchain is like a list, and since there are some tree-like structures which have been grafted on for efficiency (eg Merkle trees) and since many of the futuristic scaling proposals seem to also involve generalizing from list-like structures (eg, the blockchain) to tree-like structures (eg, side-chains and tree-chains)... well, who knows, there might be some nugget of algorithmic or architectural or data-structure inspiration there.
So... TL;DR:
(1) I'm freaked out that this blocksize debate has splintered the community so badly and dragged on so long, with no resolution in sight, and both sides seeming so right (because the other side seems so wrong).
(2) I think Bitcoin could gain immensely by using high-level formal, algebraic and co-algebraic program specification and verification languages (such as Maude including Maude-NPA, Mobile Maude parameterized skeletons, etc.) to specify (and possibly also, to some degree, verify) what Bitcoin does - before translating to low-level implementation languages such as C and Python and Java saying how Bitcoin does it. This would help to communicate and reason about programs with much more mathematical certitude - and possibly obviate the need for many political and economic tradeoffs which currently seem dismally inevitable - and possibly widen the collaboration on this project.
(3) I wonder if there are some Big Data approaches out there (eg, along the lines of Google's MapReduce and BigTable, or Yale's CalvinDB), which could be implemented to allow Bitcoin to scale massively and painlessly - and to satisfy all stakeholders, ranging from millionaires to micropayments, coffee drinkers to the great "unbanked".
submitted by BeYourOwnBank to Bitcoin [link] [comments]

Shit's about to go down...

So many hodlers have been born this year. I'm excited for them, but as they will learn, it becomes a lot more fun when you grow out of the emotions. A lot of them are gonna turn into bitcoin believers. For 9 years people have been conditioned to believe bitcoin will never die. If you think Apple has a cult following, it's nothing compared to bitcoin. This thing has actually changed people's lives, it's a statement on freedom and against the establishment, so it actually means something to a lot of people. All of this is a powerful recipe for a bubble unlike anything we've seen. I don't mean bubble in the sense that it's overvalued, far from it. What's different from the previous 'bubbles' is that this time bitcoin has actually achieved mainstream success. Almost everyone views bitcoin with positivity, which is unlike any other bubble. CME, CBOE, LedgerX, etc pretty much confirmed BTC into an official asset class. When you know literally every table in America brought up Bitcoin (Metcalfe's law - the power of the network effect), this rise after Thanksgiving is one of the best ads bitcoin can hope for. People are seeing the limitless potential of bitcoin. Everything is just going so well according to plan, and we get free alt coins worth hundreds of dollars on top of that? Best ponzi scheme ever.... thanks for making me rich guys.
I was initially thinking of planning exits after $9500, but now, with what I said above and the nature of parabolic rises, one can't underestimate just how far and how violently this thing can go. We're in the perfect condition to make unprecedented highs. If you look at the 2013 rally's log chart vs this one we've been relatively steady so far. Even now I think the price of BTC is justified in terms of supply and demand. Right now it feels like we're heading into a big move that's gonna end in a parabolic blow-off top as I mentioned earlier. You gotta start charting this shit with curved lines, not linear lines, we're in an exponential trend... which would make sense with the amount of new Coinbase users (more than doubled total users this year alone), general public perception, network effect laws, fomo, etc. I'm glad it's happening at $10K instead of ending here... Maybe I'm getting delusional myself, but I'm usually right, 20K is in the bag motherfuckers. I mean, the smart ones know it's in the bag, but I'm talking about soon, real soon.
Original post
submitted by YRuafraid to BitcoinThoughts [link] [comments]

Valuing Bitcoin Using Metcalfe's Law
Article says Bitcoin is fairly valued if you use the Price/Metcalfe value
“Metcalfe’s law states that the bigger the network of users, the greater that network’s value becomes. Robert Metcalfe, distinguished electrical engineer, was speaking specifically about Ethernet, but it also applies to cryptos. Bitcoin might look like a bubble on a simple price chart, but when we place it on a logarithmic scale, we see that a peak has not been reached yet.”
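The quoted claim is easy to make concrete. A minimal sketch of Metcalfe-style valuation follows; the user counts and the proportionality constant are hypothetical, purely for illustration:

```python
def metcalfe_value(n_users: int, k: float = 1.0) -> float:
    """Metcalfe's law: network value proportional to the square of the
    number of users. k is an unknown proportionality constant that would
    have to be fitted from historical data."""
    return k * n_users ** 2

# Hypothetical figures, purely for illustration:
users_then, users_now = 500_000, 1_000_000
ratio = metcalfe_value(users_now) / metcalfe_value(users_then)
# Doubling the user base quadruples the modeled network value,
# which is why Metcalfe charts look dramatic on a linear price axis
# yet roughly linear on a logarithmic one.
```

The key point for reading such charts: the law predicts the *shape* of growth against user count, while the constant k (and hence the actual dollar level) is an empirical fit, not a prediction of the law itself.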
submitted by mosymo to BitcoinMarkets [link] [comments]

Analyst says 94% of bitcoin's price movement over the past four years can be explained by one equation

This is the best tl;dr I could make, original reduced by 73%. (I'm a bot)
The FundStrat cofounder Tom Lee says: "If you build a very simple model valuing bitcoin as the square function number of users times the average transaction value, 94% of the bitcoin movement over the past four years is explained by that equation." This model is based on Metcalfe's law, which says the value of a network is proportional to the square of the number of users on the network.
FundStrat looked at unique addresses as a proxy for users on the bitcoin network and found that the square of this value explained 63% of the variation in bitcoin prices since 2013.
So if you build a very simple model valuing bitcoin as the square function number of users times the average transaction value, 94% of the bitcoin movement over the past four years is explained by that equation.
This linear factor explained 83% of the variation in bitcoin's price.
The chart below plots the projected price of bitcoin based on this model against the actual price.
The price of bitcoin compared with the projected value of bitcoin.
Summary Source | FAQ | Feedback | Top keywords: bitcoin#1 value#2 model#3 network#4 FundStrat#5
Post found in /Bitcoin and /BitcoinAll.
NOTICE: This thread is for discussing the submission topic. Please do not discuss the concept of the autotldr bot here.
submitted by autotldr to autotldr [link] [comments]
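The FundStrat model summarized above - value proportional to the square of the number of users times the average transaction value - can be sketched as a tiny least-squares fit for the unknown scaling constant. The series below is toy data generated to fit the model exactly, not real chain data, so the recovered constant matches by construction:

```python
def modeled_value(users: int, avg_tx_value: float, k: float = 1.0) -> float:
    """FundStrat-style model: value ~ (number of users)^2 * average
    transaction value, up to a fitted constant k."""
    return k * users ** 2 * avg_tx_value

def fit_k(observed, users, avg_tx):
    """Least-squares estimate of the scaling constant k against observed
    values. (The article reports that a fit of this form explains ~94%
    of bitcoin's price movement over four years.)"""
    num = sum(o * u ** 2 * a for o, u, a in zip(observed, users, avg_tx))
    den = sum((u ** 2 * a) ** 2 for u, a in zip(users, avg_tx))
    return num / den

# Toy series, not real chain data:
users = [100, 200, 400]
avg_tx = [10.0, 10.0, 10.0]
observed = [modeled_value(u, a, k=0.05) for u, a in zip(users, avg_tx)]
k_hat = fit_k(observed, users, avg_tx)
```

With real data the fit would of course not be exact; the reported 94% figure is the share of price variance such a regression explains, not a claim that the residual is zero.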

Deploy Segwit Now! We need to increase Tx/block

Wallet authors, we need full Segwit now. Current segwit deployment is around 10%.
submitted by paulajohnson to Bitcoin [link] [comments]

What is the current comparison of bitcoin price to Metcalfe's Law?

About a year ago, right as the last bubble began to pop, we saw a lot of comparisons between the price and Metcalfe's law, which had some eerie similarities.
This led to further discussion that prompted many to predict hyper-bitcoinization, etc.
However I can't seem to find any updated figures. Obviously the chart has diverged, because bitcoin addresses / usage has undeniably been on the rise while the price has stagnated.
Where do we sit on charts like this currently?
Can anyone provide updated charts?
And here are some Bitcoin Talk links on the subject:
Please note that the idea of Metcalfe's Law has fascinated me and I am wondering how the eventual disconnect will manifest. I am imagining the mother of all bubbles will emerge at some point, likely powered by the barriers for money flow disappearing while the eventual technical dominance becomes apparent.
submitted by americanpegasus to Bitcoin [link] [comments]

What Does Metcalfe’s Law Tell Us About Bitcoin? With Frank Holmes | How to invest in Bitcoin using Metcalfe's Law. For smart HODLERS only. | Metcalfe's Law and Uber | $50,000 Bitcoin YES | Metcalfe's Law for BITCOIN $1 million ...

Metcalfe's Law states that the value of a network is proportional to the square of the number of participants in the network. One chart plots a variant of the Law in which price is divided by n log n of the number of daily transactions; values are scaled by 10^4. A second chart plots the same variant using the number of unspent transaction outputs (UTXO) instead of daily transactions; values are scaled by 10^6. Metcalfe's Law has been successfully used to value a variety of network-effect technologies and businesses, including Facebook and Tencent. Applying it to Bitcoin, using "Daily Active Addresses" (DAA) as the "n" value, yields interesting results. Historically, Bitcoin has tracked the Metcalfe Law Fair Price reasonably well. A number of studies have been performed over recent years ... B. Generalized Metcalfe's Law: NV ~ n^1.5. The following chart exploits a variant of the law with BTCUSD divided by n log n of the number of unspent transaction outputs (UTXO); values are again scaled by 10^6. Metcalfe UTXO ... Cryptocurrencies are a new asset class, and researchers have only just started to understand the fundamental forces behind their price action. A new research paper shows that Bitcoin's price can be modeled by Metcalfe's Law. In this respect Bitcoin (and other cryptocurrencies) are very similar to Facebook: their value depends on the number of active users - the network size ...
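The three functional forms mentioned in these chart descriptions - classic n^2, the n log n variant, and the generalized NV ~ n^1.5 fit - diverge sharply at network scale, which a few lines make concrete. (The base of the logarithm is an assumption here; the chart descriptions do not specify it.)

```python
import math

def metcalfe(n: float) -> float:
    """Classic Metcalfe's law: value ~ n^2."""
    return n ** 2

def odlyzko(n: float) -> float:
    """The n log n variant used in the charts above (natural log assumed)."""
    return n * math.log(n)

def generalized(n: float) -> float:
    """The generalized fit mentioned above: NV ~ n^1.5."""
    return n ** 1.5

n = 1_000_000
variants = {f.__name__: f(n) for f in (metcalfe, odlyzko, generalized)}
# At a million participants the three forms differ by orders of magnitude:
# n^2 >> n^1.5 >> n log n, which is why the choice of variant matters so
# much for any "fair price" derived from a Metcalfe-style chart.
```

The practical takeaway: two analysts can both invoke "Metcalfe's Law" and arrive at wildly different fair-value levels simply by choosing different exponents, so the functional form deserves as much scrutiny as the user-count data.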


What Does Metcalfe’s Law Tell Us About Bitcoin? With Frank Holmes

Bitcoin Live: BTC Price Chart, Liquidation Watch, Bull vs Bear, Pump or Dump. Bitcoin is a cryptocurrency. It is a decentralized digital curr... | The value of a network may grow as the square of the number of its users, according to Metcalfe's Law, but how does the number of a network's users grow over... | 15 Year Old Forex Trader Reads Chart Like a Pro & Reveals His "Golden Zone" Trading System ... How to invest in Bitcoin using Metcalfe's Law. For smart HODLERS only. - Duration: 56:16. The Dang ... | Using Metcalfe's Law to calculate John McAfee's Bitcoin price prediction. Will bitcoin reach $50,000? $100,000? $1 million? not impossible