Bitcoin Miner Software

www.Bitcoin-pool.de mines BCO, Rupee, and many other coins. Over the past two months we have rebuilt our pool: a new, stable server with the latest software supports almost all algos. submitted by Bitcoin-pool-de to u/Bitcoin-pool-de [link] [comments]

Dragonchain Great Reddit Scaling Bake-Off Public Proposal

Dragonchain Great Reddit Scaling Bake-Off Public Proposal

Dragonchain Public Proposal TL;DR:

Dragonchain has demonstrated handling twice Reddit’s entire daily volume (votes, comments, and posts, per Reddit’s 2019 Year in Review) in a 24-hour demo on an operational network. Every single transaction on Dragonchain is decentralized immediately through 5 levels of Dragon Net, and then secured with combined proof on Bitcoin, Ethereum, Ethereum Classic, and Binance Chain, via Interchain. At the time, in January 2020, the entire cost of the demo was approximately $25K on a single system (transaction fees locked at $0.0001/txn). With current fees (lowest fee $0.0000025/txn), this would cost as little as $625.
Watch Joe walk through the entire proposal and answer questions on YouTube.
This proposal is also available on the Dragonchain blog.

Hello Reddit and Ethereum community!

I’m Joe Roets, Founder & CEO of Dragonchain. When the team and I first heard about The Great Reddit Scaling Bake-Off we were intrigued. We believe we have the solutions Reddit seeks for its community points system and we have them at scale.
For your consideration, we have submitted our proposal below. The team at Dragonchain and I welcome and look forward to your technical questions, philosophical feedback, and fair criticism, to build a scaling solution for Reddit that will empower its users. Because our architecture is unlike other blockchain platforms out there today, we expect to receive many questions while people try to grasp our project. I will answer all questions here in this thread on Reddit, and I've answered some questions in the stream on YouTube.
We have seen good discussions so far in the competition. We hope that Reddit’s scaling solution will emerge from The Great Reddit Scaling Bake-Off and that Reddit will have great success with the implementation.

Executive summary

Dragonchain is a robust open source hybrid blockchain platform that has stood the test of time since our inception in 2014. We have continued to evolve, harnessing the scalability of private nodes while taking full advantage of the security of public decentralized networks like Ethereum. We have a live, operational, and fully functional Interchain network integrating Bitcoin, Ethereum, Ethereum Classic, and ~700 independent Dragonchain nodes. Every transaction is secured to Ethereum, Bitcoin, and Ethereum Classic. Transactions are immediately usable on chain, and the first decentralization is seen within 20 seconds on Dragon Net. Security increases further to the public networks ETH, BTC, and ETC within 10 minutes to 2 hours. Smart contracts can be written in any executable language, offering full freedom to existing developers. We invite any developer to watch the demo, play with our SDKs, review the open source code, and help us move forward. Dragonchain specializes in scalable loyalty & rewards solutions and has built a decentralized social network on chain, with very affordable transaction costs. This experience can be combined with the insights Reddit and the Ethereum community have gained in the past couple of months to roll out the solution at a rapid pace.

Response and PoC

In The Great Reddit Scaling Bake-Off post, Reddit has asked for a series of demonstrations, requirements, and other considerations. In this section, we will attempt to answer all of these requests.

Live Demo

A live proof of concept showing hundreds of thousands of transactions
On Jan 7, 2020, Dragonchain hosted a 24-hour live demonstration during which a quarter of a billion (250 million+) transactions executed fully on an operational network. Every single transaction on Dragonchain is decentralized immediately through 5 levels of Dragon Net, and then secured with combined proof on Bitcoin, Ethereum, Ethereum Classic, and Binance Chain, via Interchain. This means that every single transaction is secured by, and traceable to these networks. An attack on this system would require a simultaneous attack on all of the Interchained networks.
24 hours in 4 minutes (YouTube)
The demonstration was of a single business system, and any user is able to scale this further by running multiple systems simultaneously. Our goal for the event was to demonstrate a consistent capacity greater than that of Visa over an extended time period.
Tooling to reproduce our demo is available here:
https://github.com/dragonchain/spirit-bomb

Source Code

Source code (for on & off-chain components as well as tooling used for the PoC). The source code does not have to be shared publicly, but if Reddit decides to use a particular solution it will need to be shared with Reddit at some point.

Scaling

How it works & scales

Architectural Scaling

Dragonchain’s architecture attacks the scalability issue from multiple angles. Dragonchain is a hybrid blockchain platform, wherein every transaction is protected on a business node to the requirements of that business or purpose. A business node may be held completely private or may be exposed or replicated to any level of exposure desired.
Every node has its own blockchain and is independently scalable. Dragonchain established Context Based Verification as its consensus model. Every transaction is immediately usable on a trust basis, and in time is provable to an increasing level of decentralized consensus. A transaction will have a level of decentralization to independently owned and deployed Dragonchain nodes (~700 nodes) within seconds, and full decentralization to BTC and ETH within minutes or hours. Level 5 nodes (Interchain nodes) function to secure all transactions to public or otherwise external chains such as Bitcoin and Ethereum. These nodes scale the system by aggregating multiple blocks into a single Interchain transaction on a cadence. This timing is configurable based upon average fees for each respective chain. For detailed information about Dragonchain’s architecture, and Context Based Verification, please refer to the Dragonchain Architecture Document.
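
For intuition, here is a minimal sketch, in Python, of how an Interchain-style Level 5 node might batch many block hashes into a single anchoring transaction per cadence window. The class and function names are hypothetical; this is not Dragonchain's implementation.
```python
import hashlib
import time

def merkle_root(hashes):
    """Fold a list of block hashes into one root hash (assumed aggregation scheme)."""
    if not hashes:
        raise ValueError("nothing to anchor")
    layer = list(hashes)
    while len(layer) > 1:
        if len(layer) % 2:                    # duplicate the last element on odd layers
            layer.append(layer[-1])
        layer = [hashlib.sha256(a + b).digest() for a, b in zip(layer[0::2], layer[1::2])]
    return layer[0]

class Level5Anchor:
    """Hypothetical Interchain anchor: batches L1-L4 block hashes and commits
    one combined proof per cadence window to an external chain (BTC/ETH/ETC)."""

    def __init__(self, publish_fn, cadence_seconds=600):
        self.publish_fn = publish_fn          # wraps a broadcast to the external chain
        self.cadence = cadence_seconds        # tuned to the target chain's average fee
        self.pending = []
        self.last_anchor = time.time()

    def add_block(self, block_hash: bytes):
        self.pending.append(block_hash)
        if time.time() - self.last_anchor >= self.cadence:
            self.flush()

    def flush(self):
        root = merkle_root(self.pending)
        self.publish_fn(root)                 # one external transaction secures many blocks
        self.pending.clear()
        self.last_anchor = time.time()
```
Tuning `cadence_seconds` per target chain is what is meant above by the Interchain timing being configurable based upon average fees for each respective chain.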

Economic Scaling

An interesting feature of Dragonchain’s network consensus is its economics and scarcity model. Since Dragon Net nodes (L2-L4) are independent staking nodes, deployment to cloud platforms would allow any of these nodes to scale to take on a large percentage of the verification work. This is great for scalability, but not good for the economy, because there is no scarcity, and pricing would develop a downward spiral and result in fewer verification nodes. For this reason, Dragonchain uses TIME as scarcity.
TIME is calculated as the number of Dragons held, multiplied by the number of days held. TIME influences the user’s access to features within the Dragonchain ecosystem. It takes into account both the Dragon balance and length of time each Dragon is held. TIME is staked by users against every verification node and dictates how much of the transaction fees are awarded to each participating node for every block.
TIME also dictates the transaction fee itself for the business node. TIME is staked against a business node to set a deterministic transaction fee level (see transaction fee table below in Cost section). This is very interesting in a discussion about scaling because it guarantees independence for business implementation. No matter how much traffic appears on the entire network, a business is guaranteed to not see an increased transaction fee rate.
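
A rough illustration of the TIME formula and the fee split it drives (the reward-splitting helper is a hypothetical simplification, not the actual protocol rules):
```python
def time_score(holdings):
    """TIME = Dragons held multiplied by days held, summed over holdings."""
    return sum(dragons * days for dragons, days in holdings)

def split_block_fees(total_fees, staked_time_by_node):
    """Hypothetical: award each verification node fees in proportion to the TIME staked against it."""
    total_time = sum(staked_time_by_node.values())
    return {node: total_fees * t / total_time for node, t in staked_time_by_node.items()}

# 10,000 Dragons held for 365 days plus 5,000 Dragons held for 30 days
print(time_score([(10_000, 365), (5_000, 30)]))                          # 3,800,000 TIME
print(split_block_fees(1.0, {"node-a": 3_800_000, "node-b": 1_200_000})) # 0.76 / 0.24 split
```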

Scaled Deployment

Dragonchain uses Docker and Kubernetes to apply traditional best-practice system scaling. Dragonchain offers managed nodes with an easy-to-use, web-based console interface. The user may also deploy a Dragonchain node within their own datacenter or favorite cloud platform. Users have deployed Dragonchain nodes on-prem and on Amazon AWS, Google Cloud, MS Azure, and other hosting platforms around the world. Any executable code, anything you can write, can be written into a smart contract. This flexibility is what allows us to say that developers with no blockchain experience can use any code language to access the benefits of blockchain. Customers have used NodeJS, Python, Java, and even BASH shell script to write smart contracts on Dragonchain.
With Docker containers, we achieve better separation of concerns, faster deployment, higher reliability, and lower response times.
We chose Kubernetes for its self-healing features, ability to run multiple services on one server, and its large and thriving development community. It is resilient, scalable, and automated. OpenFaaS allows us to package smart contracts as Docker images for easy deployment.
Contract deployment time is now bounded only by the size of the Docker image being deployed but remains fast even for reasonably large images. We also take advantage of Docker’s flexibility and its ability to support any language that can run on x86 architecture. Any image, public or private, can be run as a smart contract using Dragonchain.
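
As an illustration of the "any executable code" point, here is a minimal sketch of what a containerized smart contract could look like in Python. The stdin/stdout convention is an assumption for illustration, not Dragonchain's documented contract interface:
```python
import json
import sys

def handle(transaction: dict) -> dict:
    """Toy contract: award one community point per upvote found in the payload."""
    payload = transaction.get("payload", {})
    return {"user": payload.get("user"), "points_awarded": payload.get("upvotes", 0)}

if __name__ == "__main__":
    # The contract runs inside a Docker image; here we assume the platform
    # passes the transaction as JSON on stdin and reads the result from stdout.
    txn = json.load(sys.stdin)
    json.dump(handle(txn), sys.stdout)
```
Packaged as a Docker image (for example via OpenFaaS, as described above), the same pattern applies equally to Node.js, Java, or a BASH script.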

Flexibility in Scaling

Dragonchain’s architecture considers interoperability and integration as key features. From inception, we had a goal to increase adoption via integration with real business use cases and traditional systems.
We envision the ability for Reddit, in the future, to be able to integrate alternate content storage platforms or other financial services along with the token.
  • LBRY to allow users to deploy content natively to LBRY
  • MakerDAO to allow users to lend small amounts backed by their Reddit community points
  • STORJ/SIA to allow decentralized on-chain storage of portions of content
These or any other integrations are relatively easy to build on Dragonchain with an Interchain implementation.

Cost

Cost estimates (on-chain and off-chain)
For the purpose of this proposal, we assume that all transactions are on chain (posts, replies, and votes).
On the Dragonchain network, transaction costs are deterministic/predictable. By staking TIME on the business node (as described above) Reddit can reduce transaction costs to as low as $0.0000025 per transaction.
Dragonchain Fees Table

Getting Started

How to run it
Building on Dragonchain is simple and requires no blockchain experience. Spin up a business node (L1) in our managed environment (AWS), run it in your own cloud environment, or on-prem in your own datacenter. Clear documentation will walk you through the steps of spinning up your first Dragonchain Level 1 Business node.
Getting started is easy...
  1. Download Dragonchain’s dctl
  2. Input three commands into a terminal
  3. Build an image
  4. Run it
More information can be found in our Get started documents.

Architecture
Dragonchain is an open source hybrid platform. Through Dragon Net, each chain combines the power of a public blockchain (like Ethereum) with the privacy of a private blockchain.
Dragonchain organizes its network into five separate levels. A Level 1, or business node, is a totally private blockchain only accessible through the use of public/private keypairs. All business logic, including smart contracts, can be executed on this node directly and added to the chain.
After creating a block, the Level 1 business node broadcasts a version stripped of sensitive private data to Dragon Net. Three Level 2 Validating nodes validate the transaction based on guidelines determined by the business. A Level 3 Diversity node checks that the Level 2 nodes are from a diverse array of locations. A Level 4 Notary node, hosted by a KYC partner, then signs the validation record received from the Level 3 node. The transaction hash is ledgered to the Level 5 public chain to take advantage of the hash power of massive public networks.
Dragon Net can be thought of as a “blockchain of blockchains”, where every level is a complete private blockchain. Because an L1 can send to multiple nodes on a single level, proof of existence is distributed among many places in the network. Eventually, proof of existence reaches level 5 and is published on a public network.
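
A minimal sketch of the "stripped of sensitive private data" step, assuming a simple hash-only scheme; the field names are hypothetical:
```python
import hashlib
import json

def strip_for_broadcast(block: dict) -> dict:
    """Hypothetical: keep only what Dragon Net needs to verify existence and integrity,
    dropping the sensitive business payload held on the private L1 node."""
    stripped_txns = [
        {"txn_id": t["txn_id"],
         "payload_hash": hashlib.sha256(
             json.dumps(t["payload"], sort_keys=True).encode()).hexdigest()}
        for t in block["transactions"]
    ]
    return {"block_id": block["block_id"],
            "prev_hash": block["prev_hash"],
            "transactions": stripped_txns}
```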

API Documentation

APIs (on chain & off)

SDK Source

Nobody’s Perfect

Known issues or tradeoffs
  • Dragonchain is open source and, even though the platform is easy enough for developers to code in any language they are comfortable with, our developer community is not as large as Ethereum's. We would like to see the Ethereum developer community (and any other communities) become familiar with our SDKs, our solutions, and our platform, to unlock the full potential of our Ethereum Interchain. Long ago we decided to prioritize both Bitcoin and Ethereum Interchains. We envision an ecosystem that encompasses different projects to give developers the ability to take full advantage of all the opportunities blockchain offers to create decentralized solutions, not only for Reddit but for all of our current platforms and systems. We believe that together we will take the adoption of blockchain further. We currently have an additional Interchain with Ethereum Classic, and we look forward to Interchains with other blockchains in the future. We invite all blockchain projects who believe in decentralization and security to Interchain with Dragonchain.
  • We have only ~700 nodes, compared to roughly 8,000 Ethereum and 10,000 Bitcoin nodes; however, we harness those 18,000 nodes to scale to extremely high levels of security. See Dragonchain metrics.
  • Some may consider the centralization of Dragonchain’s business nodes an issue at first glance; however, the model is designed this way to protect business data. We do not consider this a drawback, as these nodes can make any, none, or all data public. Depending upon the implementation, every subreddit could have control of its own business node, for potential business and enterprise offerings, bringing new alternative revenue streams to Reddit.

Costs and resources

Summary of cost & resource information for both on-chain & off-chain components used in the PoC, as well as cost & resource estimates for further scaling. If your PoC is not on mainnet, make note of any mainnet caveats (such as congestion issues).
Every transaction on the PoC system had a transaction fee of $0.0001 (one-hundredth of a cent USD). At 256MM transactions, the demo cost $25,600. With current operational fees, the same demonstration would cost $640 USD.
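
The arithmetic behind those figures:
```python
TXNS = 256_000_000                 # transactions in the 24-hour demo
fee_demo, fee_now = 0.0001, 0.0000025
print(TXNS * fee_demo)             # 25600.0 USD at the fee locked for the demo
print(TXNS * fee_now)              # 640.0 USD at the current lowest fee
```
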
For the demonstration, to achieve throughput to mimic a worldwide payments network, we modeled several clients in AWS and 4-5 business nodes to handle the traffic. The business nodes were tuned to handle higher throughput by adjusting memory and machine footprint on AWS. This flexibility is valuable for implementing a system such as the one Reddit envisions. Given that Reddit’s daily traffic (posts, replies, and votes) is less than half that of our demo, we would expect that the entire Reddit system could be handled on 2-5 business nodes using right-sized containers on AWS or similar environments.
Verification was accomplished on the operational Dragon Net network with over 700 independently owned verification nodes running around the world at no cost to the business other than paid transaction fees.

Requirements

Scaling

This PoC should scale to the numbers below with minimal costs (both on & off-chain). There should also be a clear path to supporting hundreds of millions of users.
Over a 5 day period, your scaling PoC should be able to handle:
  • 100,000 point claims (minting & distributing points)
  • 25,000 subscriptions
  • 75,000 one-off points burning
  • 100,000 transfers
During Dragonchain’s 24 hour demo, the above required numbers were reached within the first few minutes.
Reddit’s total activity is 9000% more than Ethereum’s total transaction level. Even if you do not include votes, it is still 700% more than Ethereum’s current volume. Dragonchain has demonstrated that it can handle 250 million transactions a day, and its architecture allows multiple systems to work at that level simultaneously. In our PoC, we demonstrated double the full capacity of Reddit, and every transaction was proven all the way to Bitcoin and Ethereum.
Reddit Scaling on Ethereum

Decentralization

Solutions should not depend on any single third-party provider. We prefer solutions that do not depend on specific entities such as Reddit or another provider, and solutions with no single point of control or failure in off-chain components, but recognize there are numerous trade-offs to consider.
Dragonchain’s architecture calls for a hybrid approach. Private business nodes hold the sensitive data while the validation and verification of transactions for the business are decentralized within seconds and secured to public blockchains within 10 minutes to 2 hours. Nodes could potentially be controlled by owners of individual subreddits for more organic decentralization.
  • Billing is currently centralized - there is a path to federation and decentralization of a scaled billing solution.
  • Operational multi-cloud
  • Operational on-premises capabilities
  • Operational deployment to any datacenter
  • Over 700 independent Community Verification Nodes with proof of ownership
  • Operational Interchain (Interoperable to Bitcoin, Ethereum, and Ethereum Classic, open to more)

Usability
Scaling solutions should have a simple end user experience.

Users shouldn't have to maintain any extra state/proofs, regularly monitor activity, keep track of extra keys, or sign anything other than their normal transactions
Dragonchain and its customers have demonstrated extraordinary usability as a feature in many applications, where users do not need to know that the system is backed by a live blockchain. Lyceum is one of these examples, where the progress of academy courses is tracked and successful completion of courses is rewarded with certificates on chain. Our @Save_The_Tweet bot is popular on Twitter: when used with one of the hashtags #please, #blockchain, #ThankYou, or #eternalize, the tweet is saved through Eternal to multiple blockchains. A proof report is available for future reference. Other examples in use are DEN, our decentralized social media platform, and our console, where users can track their node rewards, view their TIME, and operate a business node.
Examples:

Transactions complete in a reasonable amount of time (seconds or minutes, not hours or days)
All transactions are immediately usable on chain by the system. A transaction begins the path to decentralization at the conclusion of a 5-second block when it gets distributed across 5 separate community run nodes. Full decentralization occurs within 10 minutes to 2 hours depending on which interchain (Bitcoin, Ethereum, or Ethereum Classic) the transaction hits first. Within approximately 2 hours, the combined hash power of all interchained blockchains secures the transaction.

Free to use for end users (no gas fees, or fixed/minimal fees that Reddit can pay on their behalf)
With transaction pricing as low as $0.0000025 per transaction, it may be considered reasonable for Reddit to cover transaction fees for users.
All of Reddit's Transactions on Blockchain (month)
Community points can be earned by users and distributed directly to their Reddit account in batch (as per Reddit's minting plan), and users can withdraw rewards to their Ethereum wallet whenever they wish. Withdrawal fees can be paid by either the user or Reddit. This model has been operating inside the Dragonchain system since 2018, and many security and financial compliance features can be optionally added. We feel that this capability greatly enhances the user experience because it is seamless to a regular user without cryptocurrency experience, yet flexible for a tech-savvy user. Currency or token transactions would occur on the Reddit network, verified to BTC and ETH, and would incur the $0.0000025 transaction fee. To estimate this fee, we use the monthly active Reddit user count (per Statista) with a 60% adoption rate and an estimated average of 10 transactions per user per month, resulting in an approximate $720 monthly cost across the system. Reddit could feasibly incur all associated internal network charges (mining/minting, transfer, burn) as these are very low and controllable fees.
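
A back-of-envelope check of the ~$720 figure. The proposal cites Statista's monthly active user count without quoting a number, so the figure below is an assumed placeholder chosen to reproduce the estimate:
```python
monthly_active_users = 48_000_000      # assumed placeholder; the proposal cites Statista
adoption_rate = 0.60
txns_per_user_per_month = 10
fee_per_txn = 0.0000025
monthly_cost = monthly_active_users * adoption_rate * txns_per_user_per_month * fee_per_txn
print(round(monthly_cost, 2))          # 720.0 USD per month across the system
```
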
Reddit Internal Token Transaction Fees

Reddit Ethereum Token Transaction Fees
When we consider further the Ethereum fees that might be incurred, we have a few choices for a solution.
  1. Offload all Ethereum transaction fees (user withdrawals) to interested users as they wish to withdraw tokens for external use or sale.
  2. Cover Ethereum transaction fees by aggregating them on a timed schedule. Users would request withdrawal (from Reddit or individual subreddits), and they would be transacted on the Ethereum network every hour (or some other schedule).
  3. In a combination of the above, customers could cover aggregated fees.
  4. Integrate with alternate Ethereum roll up solutions or other proposals to aggregate minting and distribution transactions onto Ethereum.

Bonus Points

Users should be able to view their balances & transactions via a blockchain explorer-style interface
A system powered by Dragonchain has the flexibility to present balances and transaction data to any audience, from users with no knowledge of blockchain technology to users well versed in the terms found in a typical block explorer. Transactions can be made viewable in an Eternal Proof Report, which displays raw data along with TIME staking information and traceability all the way to Bitcoin, Ethereum, and every other Interchained network. The report shows fields such as transaction ID, timestamp, block ID, multiple verifications, and Interchain proof. See example here.
Node payouts within the Dragonchain console are listed in chronological order and can be further seen in either Dragons or USD. See example here.
In our social media platform, Dragon Den, users can see, in real-time, their NRG and MTR balances. See example here.
A new influencer app powered by Dragonchain, Raiinmaker, breaks down data into a user friendly interface that shows coin portfolio, redeemed rewards, and social scores per campaign. See example here.

Exiting is fast & simple
Withdrawing funds on Dragonchain’s console requires three clicks; withdrawal flows with additional security features are available at Reddit’s discretion.

Interoperability
Compatibility with third party apps (wallets/contracts/etc) is necessary.
Proven interoperability at scale that surpasses the required specifications. Our entire platform consists of interoperable blockchains connected to each other and traditional systems. APIs are well documented. Third party permissions are possible with a simple smart contract without the end user being aware. No need to learn any specialized proprietary language. Any code base (not subsets) is usable within a Docker container. Interoperable with any blockchain or traditional APIs. We’ve witnessed relatively complex systems built by engineers with no blockchain or cryptocurrency experience. We’ve also demonstrated the creation of smart contracts within minutes built with BASH shell and Node.js. Please see our source code and API documentation.

Scaling solutions should be extensible and allow third parties to build on top of it
Open source and extensible.
APIs should be well documented and stable

Documentation should be clear and complete
For full documentation, explore our docs, SDKs, GitHub repos, architecture documents, original Disney documentation, and other links or resources provided in this proposal.

Third-party permissionless integrations should be possible & straightforward
Smart contracts are Docker based, can be written in any language, use the full language (not subsets), and can therefore be integrated with any system, including traditional system APIs.

Simple is better. Learning an uncommon or proprietary language should not be necessary. Advanced knowledge of mathematics, cryptography, or L2 scaling should not be required. Compatibility with common utilities & toolchains is expected.
Dragonchain business nodes and smart contracts leverage Docker to allow the use of literally any language or executable code. No proprietary language is necessary. We’ve witnessed relatively complex systems built by engineers with no blockchain or cryptocurrency experience. We’ve also demonstrated the creation of smart contracts within minutes built with BASH shell and Node.js.

Bonus

Bonus Points: Show us how it works. Do you have an idea for a cool new use case for Community Points? Build it!

TIME

Community points could be awarded to Reddit users based upon TIME too, whereby the longer someone is part of a subreddit, the more community points they naturally gain, even if they are not actively commenting or sharing new posts. A daily login could be required for these community points to be credited. This rewards readers too and incentivizes readers to create an account on Reddit if they browse the website often. This concept could also be leveraged to provide some level of reputation based upon duration and consistency of contribution to a community subreddit.
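
A minimal sketch of this idea (the rate and the daily-login gate are illustrative assumptions):
```python
from datetime import date

def daily_time_points(join_date: date, today: date, logged_in_today: bool, rate=0.01):
    """Hypothetical: award points proportional to membership duration, gated on a daily login."""
    if not logged_in_today:
        return 0.0
    days_in_subreddit = (today - join_date).days
    return days_in_subreddit * rate

# A member of one year who logs in today earns more than a brand-new member.
print(daily_time_points(date(2019, 1, 1), date(2020, 1, 1), logged_in_today=True))  # 3.65
```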

Dragon Den

Dragonchain has already built a social media platform that harnesses community involvement. Dragon Den is a decentralized community built on the Dragonchain blockchain platform. Dragon Den is Dragonchain’s answer to fake news, trolling, and censorship. It incentivizes the creation and evaluation of quality content within communities. It could be described as being a shareholder of a subreddit or Reddit in its entirety. The more your subreddit is thriving, the more rewarding it will be. Den is currently in a public beta and in active development, though the real token economy is not live yet. There are different tokens for various purposes. Two tokens are Lair Ownership Rights (LOR) and Lair Ownership Tokens (LOT). LOT is a non-fungible token for ownership of a specific Lair. LOT will only be created and converted from LOR.
Energy (NRG) and Matter (MTR) work jointly. Your MTR determines how much NRG you receive in a 24-hour period. Providing quality content, or evaluating content will earn MTR.

Security. Users have full ownership & control of their points.
All community points awarded based upon any type of activity or gift are secured and provable to all Interchain networks (currently BTC, ETH, ETC). Users are free to spend and withdraw their points as they please, depending on the features Reddit wants to bring into production.

Balances and transactions cannot be forged, manipulated, or blocked by Reddit or anyone else
Users can withdraw their balance to their ERC20 wallet, directly through Reddit. Reddit can cover the fees on their behalf, or the user covers this with a portion of their balance.

Users should own their points and be able to get on-chain ERC20 tokens without permission from anyone else
Through our console, users can withdraw their ERC20 rewards. This can be achieved on Reddit too. Here is a walkthrough of our console; although it does not show the quick-withdrawal functionality, a user can withdraw at any time. https://www.youtube.com/watch?v=aNlTMxnfVHw

Points should be recoverable to on-chain ERC20 tokens even if all third-parties involved go offline
If necessary, signed transactions from the Reddit system (e.g. Reddit + Subreddit) can be sent to the Ethereum smart contract for minting.

A public, third-party review attesting to the soundness of the design should be available
To our knowledge, at least two large corporations, including a top 3 accounting firm, have conducted positive reviews. These reviews have never been made public, as Dragonchain did not pay or contract for these studies to be released.

Bonus points
Public, third-party implementation review available or in progress
See above

Compatibility with HSMs & hardware wallets
For the purpose of this proposal, all tokenization would be on the Ethereum network using standard token contracts and as such, would be able to leverage all hardware wallet and Ethereum ecosystem services.

Other Considerations

Minting/distributing tokens is not performed by Reddit directly
This operation can be automated by a smart contract on Ethereum. Subreddits can, if desired, have a role to play.

One off point burning, as well as recurring, non-interactive point burning (for subreddit memberships) should be possible and scalable
This is possible and scalable with interaction between the Dragonchain Reddit system and the Ethereum token contract(s).

Fully open-source solutions are strongly preferred
Dragonchain is fully open source (see section on Disney release after conclusion).

Conclusion

Whether it is today or in the future, we would like to work together to bring secure flexibility to the highest standards. It is our hope to be considered by Ethereum, Reddit, and other integrative solutions so we may further discuss the possibilities of implementation. In our public demonstration, 256 million transactions were handled on chain in our operational network in 24 hours, for a cost of approximately $25.6K; run today, the same demonstration would cost about $640. Dragonchain’s interoperable foundation provides the atmosphere necessary to implement a frictionless community points system. Thank you for your consideration of our proposal. We look forward to working with the community to make something great!

Disney Releases Blockchain Platform as Open Source

The team at Disney created the Disney Private Blockchain Platform. The system was a hybrid interoperable blockchain platform for ledgering and smart contract development geared toward solving problems with blockchain adoption and usability. All objective evaluation would consider the team’s output a success. We released a list of use cases that we explored in some capacity at Disney, and our input on blockchain standardization as part of our participation in the W3C Blockchain Community Group.
https://lists.w3.org/Archives/Public/public-blockchain/2016May/0052.html

Open Source

In 2016, Roets proposed to release the platform as open source to spread the technology outside of Disney, as others within the W3C group were interested in the solutions that had been created inside of Disney.
Following a long process, step by step, the team met requirements for release. Among the requirements, the team had to:
  • Obtain VP support and approval for the release
  • Verify ownership of the software to be released
  • Verify that no proprietary content would be released
  • Convince the organization that there was a value to the open source community
  • Convince the organization that there was a value to Disney
  • Offer the plan for ongoing maintenance of the project outside of Disney
  • Itemize competing projects
  • Verify no conflict of interest
  • State the preferred license
  • Change the project name to not use the name Disney, any Disney character, or any other associated IP - proposed Dragonchain - approved
  • Obtain legal approval
  • Approval from corporate, parks, and other business units
  • Approval from multiple Disney patent groups
  • Copyright holder defined by Disney (Disney Connected and Advanced Technologies)
  • Trademark searches conducted for the selected name Dragonchain
  • Obtain IT security approval
  • Manual review of OSS components conducted
  • OWASP Dependency and Vulnerability Check Conducted
  • Obtain technical (software) approval
  • Offer management, process, and financial plans for the maintenance of the project.
  • Meet list of items to be addressed before release
  • Remove all Disney project references and scripts
  • Create a public distribution list for email communications
  • Remove Roets’ direct and internal contact information
  • Create public Slack channel and move from Disney slack channels
  • Create proper labels for issue tracking
  • Rename internal private Github repository
  • Add informative description to Github page
  • Expand README.md with more specific information
  • Add information beyond current “Blockchains are Magic”
  • Add getting started sections and info on cloning/forking the project
  • Add installation details
  • Add uninstall process
  • Add unit, functional, and integration test information
  • Detail how to contribute and get involved
  • Describe the git workflow that the project will use
  • Move to public, non-Disney git repository (Github or Bitbucket)
  • Obtain Disney Open Source Committee approval for release
On top of meeting the above criteria, as part of the process, the maintainer of the project had to receive the codebase on their own personal email and create accounts for maintenance (e.g. Github) with non-Disney accounts. Given the fact that the project spanned multiple business units, Roets was individually responsible for its ongoing maintenance. Because of this, he proposed in the open source application to create a non-profit organization to hold the IP and maintain the project. This was approved by Disney.
The Disney Open Source Committee approved the application known as OSSRELEASE-10, and the code was released on October 2, 2016. Disney decided to not issue a press release.
Original OSSRELEASE-10 document

Dragonchain Foundation

The Dragonchain Foundation was created on January 17, 2017. https://den.social/l/Dragonchain/24130078352e485d96d2125082151cf0/dragonchain-and-disney/
submitted by j0j0r0 to ethereum [link] [comments]

Technical: Taproot: Why Activate?

This is a follow-up on https://old.reddit.com/Bitcoin/comments/hqzp14/technical_the_path_to_taproot_activation/
Taproot! Everybody wants it!! But... you might ask yourself: sure, everybody else wants it, but why would I, sovereign Bitcoin HODLer, want it? Surely I can be better than everybody else because I swapped XXX fiat for Bitcoin unlike all those nocoiners?
And it is important for you to know the reasons why you, o sovereign Bitcoiner, would want Taproot activated. After all, your nodes (or the nodes your wallets use, which, if you are SPV, you hopefully can pester your wallet vendor/implementor about) need to be upgraded in order for Taproot activation to actually succeed instead of becoming a hot sticky mess.
First, let's consider some principles of Bitcoin.
I'm sure most of us here would agree that the above are very important principles of Bitcoin and that these are principles we would not be willing to remove. If anything, we would want those principles strengthened (especially the last one, financial privacy, which current Bitcoin is only sporadically strong with: you can get privacy, it just requires effort to do so).
So, how does Taproot affect those principles?

Taproot and Your Coins

Most HODLers probably HODL their coins in singlesig addresses. Sadly, switching to Taproot would do very little for you (it gives a mild discount at spend time, at the cost of a mild increase in fee at receive time (paid by whoever sends to you, so if it's a self-send from a P2PKH or bech32 address, you pay for this); mostly a wash).
(technical details: a Taproot output is 1 version byte + 32 byte public key, while a P2WPKH (bech32 singlesig) output is 1 version byte + 20 byte public key hash, so the Taproot output spends 12 bytes more; spending from a P2WPKH requires revealing a 33-byte public key later, which is not needed with Taproot, and Taproot signatures are about 9 bytes smaller than P2WPKH signatures, but the 33 bytes plus 9 bytes is divided by 4 because of the witness discount, so it saves about 10.5 bytes; mostly a wash, it increases blockweight by about 1.5 virtual bytes, 6 weight, for each Taproot-output-input compared to P2WPKH-output-input).
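
Spelled out with the byte counts above (same simplified accounting, ignoring push bytes):
```python
output_extra  = 33 - 21        # Taproot output bytes vs P2WPKH output bytes
witness_saved = 33 + 9         # public key no longer revealed + smaller signature (witness bytes)
print(output_extra - witness_saved / 4)   # 1.5 vbytes net increase: "mostly a wash"
```
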
However, as your HODLings grow in value, you might start wondering if multisignature k-of-n setups might be better for the security of your savings. And it is in multisignature that Taproot starts to give benefits!
Taproot switches to using Schnorr signing scheme. Schnorr makes key aggregation -- constructing a single public key from multiple public keys -- almost as trivial as adding numbers together. "Almost" because it involves some fairly advanced math instead of simple boring number adding, but hey when was the last time you added up your grocery list prices by hand huh?
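
To make the "adding numbers together" intuition concrete, here is a toy sketch of Schnorr-style key and signature aggregation. It uses plain modular arithmetic in place of the real secp256k1 curve, so it only illustrates the linearity property and is not usable cryptography (real schemes like MuSig also add key coefficients and nonce commitments to block rogue-key attacks):
```python
import hashlib
import secrets

# Toy group: a "public key" is just x*G mod P. Real Schnorr uses elliptic-curve
# points on secp256k1; only the linearity being illustrated carries over.
P = 2**127 - 1          # toy prime modulus (NOT secure)
G = 5                   # toy generator

def point(x):
    """Toy 'public key' for secret scalar x."""
    return (x * G) % P

def challenge(*parts):
    """Challenge hash e = H(R, P_agg, msg), reduced into the toy group."""
    return int.from_bytes(hashlib.sha256(repr(parts).encode()).digest(), "big") % P

# Two signers with secrets x1, x2; the aggregate public key is just the sum.
x1, x2 = secrets.randbelow(P), secrets.randbelow(P)
P_agg = (point(x1) + point(x2)) % P            # equals point((x1 + x2) % P)

# Each signer contributes a nonce and a partial signature; partials also just add.
k1, k2 = secrets.randbelow(P), secrets.randbelow(P)
R = (point(k1) + point(k2)) % P
e = challenge(R, P_agg, b"spend the 2-of-2 output")
s = (k1 + e * x1 + k2 + e * x2) % P            # s1 + s2

# One aggregate signature (R, s) verifies against the single aggregate key.
assert point(s) == (R + e * P_agg) % P
```
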
With current P2SH and P2WSH multisignature schemes, if you have a 2-of-3 setup, then to spend, you need to provide two different signatures from two different public keys. With Taproot, you can create, using special moon math, a single public key that represents your 2-of-3 setup. Then you just put two of your devices together, have them communicate to each other (this can be done airgapped, in theory, by sending QR codes: the software to do this is not even being built yet, but that's because Taproot hasn't activated yet!), and they will make a single signature to authorize any spend from your 2-of-3 address. That's 73 witness bytes -- 18.25 virtual bytes -- of signatures you save!
And if you decide that your current setup with 1-of-1 P2PKH / P2WPKH addresses is just fine as-is: well, that's the whole point of a softfork: backwards-compatibility; you can receive from Taproot users just fine, and once your wallet is updated for Taproot-sending support, you can send to Taproot users just fine as well!
(P2WPKH and P2WSH -- SegWit v0 -- addresses start with bc1q; Taproot -- SegWit v1 --- addresses start with bc1p, in case you wanted to know the difference; in bech32 q is 0, p is 1)
Now how about HODLers who keep all, or some, of their coins on custodial services? Well, any custodial service worth its salt would be doing at least 2-of-3, or probably something even bigger, like 11-of-15. So your custodial service, if it switched to using Taproot internally, could save a lot more (imagine an 11-of-15 getting reduced from 11 signatures to just 1!), which --- we can only hope! --- should translate to lower fees and better customer service from your custodial service!
So I think we can say, very accurately, that the Bitcoin principle --- that YOU are in control of your money --- can only be helped by Taproot (if you are doing multisignature), and, because P2PKH and P2WPKH remain validly-usable addresses in a Taproot future, will not be harmed by Taproot. Its benefit to this principle might be small (it mostly only benefits multisignature users) but since it has no drawbacks with this (i.e. singlesig users can continue to use P2WPKH and P2PKH still) this is still a nice, tidy win!
(even singlesig users get a minor benefit, in that multisig users will now reduce their blockchain space footprint, so that fees can be kept low for everybody; so for example even if you have your single set of private keys engraved on titanium plates sealed in an airtight box stored in a safe buried in a desert protected by angry nomads riding giant sandworms because you're the frickin' Kwisatz Haderach, you still gain some benefit from Taproot)
And here's the important part: if P2PKH/P2WPKH is working perfectly fine with you and you decide to never use Taproot yourself, Taproot will not affect you detrimentally. First do no harm!

Taproot and Your Contracts

No one is an island, no one lives alone. Give and you shall receive. You know: by trading with other people, you can gain expertise in some obscure little necessity of the world (and greatly increase your productivity in that little field), and then trade the products of your expertise for necessities other people have created, all of you thereby gaining gains from trade.
So, contracts, which are basically enforceable agreements that facilitate trading with people who you do not personally know and therefore might not trust.
Let's start with a simple example. You want to buy some gewgaws from somebody. But you don't know them personally. The seller wants the money, you want their gewgaws, but because of the lack of trust (you don't know them!! what if they're scammers??) neither of you can benefit from gains from trade.
However, suppose both of you know of some entity that both of you trust. That entity can act as a trusted escrow. The entity provides you security: this enables the trade, allowing both of you to get gains from trade.
In Bitcoin-land, this can be implemented as a 2-of-3 multisignature. The three signatories in the multisignature would be you, the gewgaw seller, and the escrow. You put the payment for the gewgaws into this 2-of-3 multisignature address.
Now, suppose it turns out neither of you are scammers (whaaaat!). You receive the gewgaws just fine and you're willing to pay up for them. Then you and the gewgaw seller just sign a transaction --- you and the gewgaw seller are 2, sufficient to trigger the 2-of-3 --- that spends from the 2-of-3 address to a singlesig the gewgaw seller wants (or whatever address the gewgaw seller wants).
But suppose some problem arises. The seller gave you gawgews instead of gewgaws. Or you decided to keep the gewgaws but not sign the transaction to release the funds to the seller. In either case, the escrow is notified, and it can sign with you to refund the funds back to you (if the seller was a scammer) or sign with the seller to forward the funds to the seller (if you were a scammer).
Taproot helps with this: as mentioned above, it allows multisignature setups to produce only one signature, reducing blockchain space usage, and thus making contracts --- which by definition require multiple people; you don't make contracts with yourself --- cheaper (which we hope enables more of these setups to happen, for more gains from trade for everyone; also, moon and lambos).
(technology-wise, it's easier to make an n-of-n than a k-of-n, making a k-of-n would require a complex setup involving a long ritual with many communication rounds between the n participants, but an n-of-n can be done trivially with some moon math. You can, however, make what is effectively a 2-of-3 by using a three-branch SCRIPT: either 2-of-2 of you and seller, OR 2-of-2 of you and escrow, OR 2-of-2 of escrow and seller. Fortunately, Taproot adds a facility to embed a SCRIPT inside a public key, so you can have a 2-of-2 Taprooted address (between you and seller) with a SCRIPT branch that can instead be spent with 2-of-2 (you + escrow) OR 2-of-2 (seller + escrow), which implements the three-branched SCRIPT above. If neither of you are scammers (hopefully the common case) then you both sign using your keys and never have to contact the escrow, since you are just using the escrow public key without coordinating with them (because n-of-n is trivial but k-of-n requires setup with communication rounds), so in the "best case" where both of you are honest traders, you also get a privacy boost, in that the escrow never learns you have been trading on gewgaws, I mean ewww, gawgews are much better than gewgaws and therefore I now judge you for being a gewgaw enthusiast, you filthy gewgawer).
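
Continuing the same toy modular-arithmetic stand-in (illustration only, not the real BIP-341 construction), the "embed a SCRIPT inside a public key" trick is a tweak of the aggregate key by a hash committing to the script branches:
```python
import hashlib
import secrets

P, G = 2**127 - 1, 5                 # same toy group as the earlier sketch (NOT secure)
point = lambda x: (x * G) % P
tagged_hash = lambda *parts: int.from_bytes(hashlib.sha256(repr(parts).encode()).digest(), "big") % P

x_you, x_seller = secrets.randbelow(P), secrets.randbelow(P)

# Internal key: the 2-of-2 aggregate of you + seller (the cooperative path).
P_internal = (point(x_you) + point(x_seller)) % P

# Script branches for the fallback paths; the escrow only ever appears here.
branches = [b"2-of-2(you, escrow)", b"2-of-2(seller, escrow)"]
script_root = tagged_hash(*branches)

# The output key commits to both the internal key and the script tree.
t = tagged_hash(P_internal, script_root)
P_output = (P_internal + point(t)) % P

# Cooperative spend: you and the seller jointly know x_you + x_seller + t, so you can
# sign for P_output directly, revealing nothing about the escrow branches on chain.
assert point((x_you + x_seller + t) % P) == P_output
```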

Taproot and Your Contracts, Part 2: Cryptographic Boogaloo

Now suppose you want to buy some data instead of things. For example, maybe you have some closed-source software in trial mode installed, and want to pay the developer for the full version. You want to pay for an activation code.
This can be done, today, by using an HTLC. The developer tells you the hash of the activation code. You pay to an HTLC, paying out to the developer if it reveals the preimage (the activation code), or refunding the money back to you after a pre-agreed timeout. If the developer claims the funds, it has to reveal the preimage, which is the activation code, and you can now activate your software. If the developer does not claim the funds by the timeout, you get refunded.
And you can do that, with HTLCs, today.
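
A toy model of the two HTLC spend paths, in plain Python rather than real Bitcoin Script (the names and values are illustrative):
```python
import hashlib

def htlc_spend(payment_hash, timeout_height, *, preimage=None, current_height=None,
               developer_signed=False, buyer_signed=False):
    """Toy HTLC: pay the developer if they reveal the preimage, refund the buyer after the timeout."""
    if (preimage is not None and developer_signed
            and hashlib.sha256(preimage).digest() == payment_hash):
        return "developer claims the funds; the preimage (activation code) is now public"
    if current_height is not None and buyer_signed and current_height >= timeout_height:
        return "buyer refunded after the timeout"
    raise ValueError("neither spend path is satisfied")

activation_code = b"XYZZY-1234"                      # hypothetical activation code
H = hashlib.sha256(activation_code).digest()         # the hash the developer tells you up front
print(htlc_spend(H, 700_000, preimage=activation_code, developer_signed=True))
print(htlc_spend(H, 700_000, current_height=700_050, buyer_signed=True))
```
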
Of course, HTLCs do have problems:
  • The hash is visible on chain, so HTLC spends are recognizable and correlatable by anyone scraping the blockchain; they are not deniable.
  • The hash and the revealed preimage take up extra blockchain space compared to a plain signature.
Fortunately, with Schnorr (which is enabled by Taproot), we can now use the Scriptless Script construction by Andrew Poelstra. This Scriptless Script allows a new construction, the PTLC or Pointlocked Timelocked Contract. Instead of hashes and preimages, just replace "hash" with "point" and "preimage" with "scalar".
Or as you might know them: "point" is really "public key" and "scalar" is really a "private key". What a PTLC does is that, given a particular public key, the pointlocked branch can be spent only if the spender reveals the private key of the given public key to you.
Another nice thing with PTLCs is that they are deniable. What appears onchain is just a single 2-of-2 signature between you and the developer/manufacturer. It's like a magic trick. This signature has no special watermarks, it's a perfectly normal signature (the pledge). However, from this signature, plus some data given to you by the developer/manufacturer (known as the adaptor signature), you can derive the private key of a particular public key you both agree on (the turn). Anyone scraping the blockchain will just see signatures that look just like every other signature, and as long as nobody manages to hack you and get a copy of the adaptor signature or the private key, they cannot get the private key behind the public key (point) that the pointlocked branch needs (the prestige).
(Just to be clear, the public key you are getting the private key from is distinct from the public key that the developer/manufacturer will use for its funds. The activation key is different from the developer's onchain Bitcoin key, and it is the activation key whose private key you will be learning, not the developer's/manufacturer's onchain Bitcoin key).
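
Using the same toy arithmetic as before (illustration only, not a real adaptor-signature protocol), here is the shape of the trick: the developer hands you an incomplete signature off-chain, and completing it on chain necessarily leaks the activation-code scalar to you:
```python
import hashlib
import secrets

P, G = 2**127 - 1, 5                       # same toy modular-arithmetic group (NOT secure)
point = lambda x: (x * G) % P
challenge = lambda *parts: int.from_bytes(hashlib.sha256(repr(parts).encode()).digest(), "big") % P

# The "activation code" is the private key a of an agreed-upon point A = a*G.
a = secrets.randbelow(P)
A = point(a)

# Developer's signing key and nonce for the payment transaction.
x_dev, k = secrets.randbelow(P), secrets.randbelow(P)
R = (point(k) + A) % P                      # nonce commitment offset by the point A
e = challenge(R, point(x_dev), b"pay the developer")

# The adaptor signature is an "incomplete" signature, missing the scalar a.
s_adaptor = (k + e * x_dev) % P
assert point(s_adaptor) == ((R - A) % P + e * point(x_dev)) % P   # buyer verifies it before paying

# To claim the funds, the developer must publish the completed signature on chain.
s_final = (s_adaptor + a) % P
assert point(s_final) == (R + e * point(x_dev)) % P               # looks like any ordinary signature

# The buyer, holding the adaptor, learns the activation code from the on-chain signature.
assert (s_final - s_adaptor) % P == a
```
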
So:
Taproot lets PTLCs exist onchain because it enables Schnorr, which is a requirement of PTLCs / Scriptless Script.
(technology-wise, take note that Scriptless Script works only for the "pointlocked" branch of the contract; you need normal Script, or a pre-signed nLockTimed transaction, for the "timelocked" branch. Since Taproot can embed a script, you can have the Taproot pubkey be a 2-of-2 to implement the Scriptless Script "pointlocked" branch, then have a hidden script that lets you recover the funds with an OP_CHECKLOCKTIMEVERIFY after the timeout if the seller does not claim the funds.)

Quantum Quibbles!

Now if you were really paying attention, you might have noticed this parenthetical:
(technical details: a Taproot output is 1 version byte + 32 byte public key, while a P2WPKH (bech32 singlesig) output is 1 version byte + 20 byte public key hash...)
So wait, Taproot uses raw 32-byte public keys, and not public key hashes? Isn't that more quantum-vulnerable??
Well, in theory yes. In practice, they probably are not.
It's not that hashes can be broken by quantum computers --- they still can't be. Instead, you have to look at how you spend from a P2WPKH/P2PKH pay-to-public-key-hash.
When you spend from a P2PKH / P2WPKH, you have to reveal the public key. Then Bitcoin hashes it and checks if this matches with the public-key-hash, and only then actually validates the signature for that public key.
So an unconfirmed transaction, floating in the mempools of nodes globally, will show, in plain sight for everyone to see, your public key.
(public keys should be public, that's why they're called public keys, LOL)
And if quantum computers are fast enough to be of concern, then they are probably fast enough that, in the several minutes to several hours from broadcast to confirmation, they have already cracked the public key that is openly broadcast with your transaction. The owner of the quantum computer can now replace your unconfirmed transaction with one that pays the funds to itself. Even if you did not opt in to RBF, miners are still incentivized to support RBF on RBF-disabled transactions.
So the extra hash is not as significant a protection against quantum computers as you might think. Instead, the extra hash-and-compare needed is just extra validation effort.
Further, if you have ever, in the past, spent from the address, then there exists already a transaction indelibly stored on the blockchain, openly displaying the public key from which quantum computers can derive the private key. So those are still vulnerable to quantum computers.
For the most part, the cryptographers behind Taproot (and Bitcoin Core) are of the opinion that quantum computers capable of cracking Bitcoin pubkeys are unlikely to appear within a decade or two.
So:
For now, the homomorphic and linear properties of elliptic curve cryptography provide a lot of benefits --- particularly the linearity property is what enables Scriptless Script and simple multisignature (i.e. multisignatures that are just 1 signature onchain). So it might be a good idea to take advantage of them now while we are still fairly safe against quantum computers. It seems likely that quantum-safe signature schemes are nonlinear (thus losing these advantages).

Summary

I Wanna Be The Taprooter!

So, do you want to help activate Taproot? Here's what you, mister sovereign Bitcoin HODLer, can do!

But I Hate Taproot!!

That's fine!

Discussions About Taproot Activation

submitted by almkglor to Bitcoin [link] [comments]

Subreddit Stats: programming top posts from 2019-10-22 to 2020-10-21 06:41 PDT

Period: 364.67 days
                   Submissions   Comments
Total                     1000     180545
Rate (per day)            2.74     491.84
Unique Redditors           629      34951
Combined Score         1178903    2688497

Top Submitters' Top Submissions

  1. 47468 points, 49 submissions: iamkeyur
    1. One Guy Ruined Hacktoberfest 2020 (3039 points, 584 comments)
    2. AWS forked my project and launched it as its own service (2956 points, 810 comments)
    3. Privacy analysis of Tiktok’s app and website (2858 points, 234 comments)
    4. 98.css – design system for building faithful recreations of Windows 98 UIs (2781 points, 318 comments)
    5. Microsoft demos language model that writes code based on signature and comment (2621 points, 614 comments)
    6. Why does HTML think “chucknorris” is a color? (2565 points, 531 comments)
    7. Windows 95 UI Design (2309 points, 665 comments)
    8. The Linux codebase has over 3k TODO comments, many from over a decade ago (2119 points, 369 comments)
    9. eBay is port scanning visitors to their website (1829 points, 236 comments)
    10. Using const/let instead of var can make JavaScript code run 10× slower in Webkit (1814 points, 525 comments)
  2. 44853 points, 28 submissions: speckz
    1. From August, Chrome will start blocking ads that consume 4MB of network data, 15 seconds of CPU usage in any 30 second period, or 60 seconds of total CPU usage (8434 points, 590 comments)
    2. How To Spot Toxic Software Jobs From Their Descriptions (6246 points, 1281 comments)
    3. A Facebook crawler was making 7M requests per day to my stupid website (2662 points, 426 comments)
    4. Apple, Your Developer Documentation is Garbage (2128 points, 432 comments)
    5. The code I’m still ashamed of (2016) (2105 points, 429 comments)
    6. Slack Is Fumbling Developers And The Rise Of Developer Discords (2095 points, 811 comments)
    7. The Chromium project finds that around 70% of our serious security bugs are memory safety problems. Our next major project is to prevent such bugs at source. (1959 points, 418 comments)
    8. Advice to Myself When Starting Out as a Software Developer (1934 points, 257 comments)
    9. Software patents are another kind of disease (1893 points, 419 comments)
    10. My favourite Git commit (1772 points, 206 comments)
  3. 35237 points, 28 submissions: whackri
    1. It is perfectly OK to only code at work, you can have a life too (6765 points, 756 comments)
    2. Kernighan's Law - Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it. (5171 points, 437 comments)
    3. The entire Apollo 11 computer code that helped get us to the Moon is available on github. (3841 points, 433 comments)
    4. Raytracing - in Excel! (2478 points, 168 comments)
    5. Writing userspace USB drivers for abandoned devices (1689 points, 84 comments)
    6. Drum Machine in Excel (1609 points, 60 comments)
    7. fork() can fail: this is important (1591 points, 264 comments)
    8. Learn how computers add numbers and build a 4 bit adder circuit (1548 points, 66 comments)
    9. Heroes Of Might And Magic III engine written from scratch (open source, playable) (1453 points, 84 comments)
    10. Apollo Guidance Computer: Restoring the computer that put man on the Moon (1277 points, 47 comments)
  4. 14588 points, 11 submissions: pimterry
    1. I'm a software engineer going blind, how should I prepare? (4237 points, 351 comments)
    2. The 2038 problem is already affecting some systems (1988 points, 518 comments)
    3. TLDR pages: Simplified, community-driven man pages (1897 points, 182 comments)
    4. JetBrains Mono: A Typeface for Developers (1728 points, 456 comments)
    5. BlurHash: extremely compact representations of image placeholders (930 points, 159 comments)
    6. Let's Destroy C (855 points, 290 comments)
    7. Shared Cache is Going Away (833 points, 192 comments)
    8. XML is almost always misused (766 points, 538 comments)
    9. Wireshark has a new packet diagram view (688 points, 24 comments)
    10. fork() can fail: this is important (460 points, 299 comments)
  5. 14578 points, 9 submissions: magenta_placenta
    1. Trello handed over user's personal account to user's previous company (2962 points, 489 comments)
    2. Feds: IBM did discriminate against older workers in making layoffs - “Analysis shows it was primarily older workers (85.85%) in the total potential pool of those considered for layoff,” the EEOC wrote (2809 points, 509 comments)
    3. Stripe Workers Who Relocate Get $20,000 Bonus and a Pay Cut - Stripe Inc. plans to make a one-time payment of $20,000 to employees who opt to move out of San Francisco, New York or Seattle, but also cut their base salary by as much as 10% (2765 points, 989 comments)
    4. US court fully legalized website scraping and technically prohibited it - On September 9, the U.S. 9th circuit court of Appeals ruled that web scraping public sites does not violate the CFAA (Computer Fraud and Abuse Act) (2014 points, 327 comments)
    5. I Suspect many Task Deadlines are Designed to Force Engineers to Work for Free (1999 points, 553 comments)
    6. Intent to Deprecate and Freeze: The User-Agent string (1012 points, 271 comments)
    7. Contractor admits planting logic bombs in his software to ensure he’d get new work (399 points, 182 comments)
    8. AlphaStar: Grandmaster level in StarCraft II using multi-agent reinforcement learning (396 points, 97 comments)
    9. Half of the websites using WebAssembly use it for malicious purposes - WebAssembly not that popular: Only 1,639 sites of the Top 1 Million use WebAssembly (222 points, 133 comments)
  6. 13750 points, 3 submissions: pedrovhb
    1. Bubble sort visualization (7218 points, 276 comments)
    2. Breadth-first search visualization (3874 points, 96 comments)
    3. Selection sort visualization (2658 points, 80 comments)
  7. 11833 points, 1 submission: flaming_bird
    1. 20GB leak of Intel data: whole Git repositories, dev tools, backdoor mentions in source code (11833 points, 956 comments)
  8. 11208 points, 10 submissions: PowerOfLove1985
    1. No cookie consent walls — and no, scrolling isn’t consent, says EU data protection body (5975 points, 890 comments)
    2. Redesigning uBlock Origin (1184 points, 162 comments)
    3. Playing Around With The Fuchsia Operating System (696 points, 164 comments)
    4. Microsoft's underwater data centre resurfaces after two years (623 points, 199 comments)
    5. Microsoft Paint/Paintbrush in Javascript (490 points, 58 comments)
    6. GitHub shuts off access to Aurelia repository, citing trade sanctions (478 points, 81 comments)
    7. How 3D Game Rendering Works: Texturing (475 points, 22 comments)
    8. Simdjson: Parsing Gigabytes of JSON per Second (441 points, 90 comments)
    9. How 1500 bytes became the MTU of the internet (435 points, 60 comments)
    10. It’s OK for your open source library to be a bit shitty (411 points, 130 comments)
  9. 10635 points, 8 submissions: michalg82
    1. Turning animations to 60fps using AI (3449 points, 234 comments)
    2. Bug #1463112 “Cat sitting on keyboard crashes lightdm” (3150 points, 143 comments)
    3. Heroes Of Might And Magic III engine written from scratch (open source, playable) (1431 points, 172 comments)
    4. Vulkan is coming to Raspberry Pi: first triangle - Raspberry Pi (1318 points, 66 comments)
    5. An EPYC trip to Rome: AMD is Cloudflare's 10th-generation Edge server CPU (431 points, 60 comments)
    6. Microsoft cancels GDC 2020 presence due to coronavirus concerns (Following Sony, Facebook, Kojima Productions, Epic Games, Unity, and more) (371 points, 52 comments)
    7. Moving from reCAPTCHA to hCaptcha - The Cloudflare Blog (278 points, 71 comments)
    8. How much of a genius-level move was using binary space partitioning in Doom? (207 points, 109 comments)
  10. 10106 points, 10 submissions: SerenityOS
    1. Someone suggested I should host my website on my own OS. For that we'll need a web server, so here's me building a basic web server in C++ for SerenityOS! (2269 points, 149 comments)
    2. I've been learning about OS security lately. Here's me making a local root exploit for SerenityOS, and then fixing the kernel bugs that made it possible! (1372 points, 87 comments)
    3. SerenityOS was hacked in a 36c3 CTF! (Exploit and write-up) (1236 points, 40 comments)
    4. One week ago, I started building a JavaScript engine for SerenityOS. Here’s me integrating it with the web browser and adding some simple API’s like alert()! (1169 points, 63 comments)
    5. Implementing macOS-style "purgeable memory" in my kernel. This technique is amazing and helps apps be better memory usage citizens! (1131 points, 113 comments)
    6. SerenityOS: The second year (900 points, 101 comments)
    7. Using my own C++ IDE to make a little program for decorating my webcam frame (571 points, 33 comments)
    8. This morning I ported git to SerenityOS. It took about an hour and some hacks, but it works! :D (547 points, 64 comments)
    9. Smarter C/C++ inlining with attribute((flatten)) (521 points, 118 comments)
    10. Introduction to SerenityOS GUI programming (390 points, 45 comments)

Top Commenters

  1. XANi_ (10753 points, 821 comments)
  2. dnew (7513 points, 641 comments)
  3. drysart (7479 points, 202 comments)
  4. MuonManLaserJab (6666 points, 233 comments)
  5. SanityInAnarchy (6331 points, 350 comments)
  6. AngularBeginner (6215 points, 59 comments)
  7. SerenityOS (5627 points, 128 comments)
  8. chucker23n (5465 points, 370 comments)
  9. IshKebab (4898 points, 393 comments)
  10. L3tum (4857 points, 199 comments)

Top Submissions

  1. 20GB leak of Intel data: whole Git repositories, dev tools, backdoor mentions in source code by flaming_bird (11833 points, 956 comments)
  2. hentAI: Detecting and removing censors with Deep Learning and Image Segmentation by 7cmStrangler (9621 points, 395 comments)
  3. US Politicians Want to Ban End-to-End Encryption by CarrotRobber (9427 points, 523 comments)
  4. From August, Chrome will start blocking ads that consume 4MB of network data, 15 seconds of CPU usage in any 30 second period, or 60 seconds of total CPU usage by speckz (8434 points, 590 comments)
  5. Mozilla: The Greatest Tech Company Left Behind by matthewpmacdonald (7566 points, 1087 comments)
  6. Bubble sort visualization by pedrovhb (7218 points, 276 comments)
  7. During lockdown my wife has been suffering mentally from pressure to stay at her desk 100% of the time otherwise after a few minutes her laptop locks and she is recorded as inactive. I wrote this small app to help her escape her desk by periodically moving the cursor. Hopefully it can help others. by silitbang6000 (7193 points, 855 comments)
  8. It is perfectly OK to only code at work, you can have a life too by whackri (6765 points, 756 comments)
  9. Blockchain, the amazing solution for almost nothing by imogenchampagne (6725 points, 1561 comments)
  10. Blockchain, the amazing solution for almost nothing by jessefrederik (6524 points, 1572 comments)

Top Comments

  1. 2975 points: deleted's comment in hentAI: Detecting and removing censors with Deep Learning and Image Segmentation
  2. 2772 points: I_DONT_LIE_MUCH's comment in 20GB leak of Intel data: whole Git repositories, dev tools, backdoor mentions in source code
  3. 2485 points: api's comment in Stripe Workers Who Relocate Get $20,000 Bonus and a Pay Cut - Stripe Inc. plans to make a one-time payment of $20,000 to employees who opt to move out of San Francisco, New York or Seattle, but also cut their base salary by as much as 10%
  4. 2484 points: a_false_vacuum's comment in Stack Overflow lays off 15%
  5. 2464 points: iloveparagon's comment in Google engineer breaks down the problems he uses when doing technical interviews. Lots of advice on algorithms and programming.
  6. 2384 points: why_not_both_bot's comment in During lockdown my wife has been suffering mentally from pressure to stay at her desk 100% of the time otherwise after a few minutes her laptop locks and she is recorded as inactive. I wrote this small app to help her escape her desk by periodically moving the cursor. Hopefully it can help others.
  7. 2293 points: ThatInternetGuy's comment in Iranian Maintainer refuses to merge code from Israeli Developer. Cites Iranian regulations.
  8. 2268 points: xequae's comment in I'm a software engineer going blind, how should I prepare?
  9. 2228 points: turniphat's comment in AWS forked my project and launched it as its own service
  10. 2149 points: Rami-Slicer's comment in 20GB leak of Intel data: whole Git repositories, dev tools, backdoor mentions in source code
Generated with BBoe's Subreddit Stats
submitted by flpezet to subreddit_stats [link] [comments]

NEAR PROJECT REPORT

NEAR PROJECT REPORT
Author: Gamals Ahmed, CoinEx Business Ambassador

ABSTRACT

A handful of companies have leveraged the web's network effects to capture large numbers of users, holding onto their data to keep them from seeking alternatives. Likewise, these huge platforms have attracted applications to build on top of their ecosystems, then either severed access or actively opposed the applications' interests once they became successful. As a result, these walled gardens have effectively hindered innovation and monopolized large sections of the web. With the emergence of blockchain technology and decentralized cryptocurrencies, the need for applications that support decentralization has become clear, and several blockchain-based companies, applications and platforms have appeared. In this research report, we explain the approach adopted by the NEAR platform in designing and implementing the core technology of its system. NEAR is a community-run platform for decentralized cloud computing and storage, designed to enable the open web of the future. On this web, everything from new currencies to new applications to new industries can be created, opening the door to an entirely new future.

1. INTRODUCTION

The richness of the web has grown day by day through the combined efforts of millions of people who have benefited from "innovation without permission": content and applications are created without asking anyone. But the lack of data freedom has led to an environment hostile to the interests of its participants. And, as explained in the abstract above, web hosting companies have hindered innovation and largely monopolized the web.
In the future, we can fix this by using new technologies to re-enable the permissionless innovation of the past in a way, which creates a more open web where users are free and applications are supportive rather than adversarial to their interests.
Decentralization emerged after the global financial crisis of 2008, which created fundamental problems of confidence in a heavily indebted banking system. The decentralized financial sector, built on blockchain technology, has been growing since 2009.
Decentralized Blockchain technology has made it easy for decentralized digital currencies like Bitcoin to exchange billions of dollars in peer-to-peer transfers for a fraction of the price of a traditional banking system. This technology allows participants in the over $ 50 billion virtual goods economy to track, own and trade in these commodities without permission. It allows real-world goods to cross into the digital domain, with verified ownership and tracking just like that of the digital.
By default, the Internet where freedom of data enables innovation will lead to the development of a new form of software development. On this web, developers can quickly create applications from open state components and boost their efforts by using new business models that are enabled from within the program itself rather than relying on parasitic relationships with their users. This not only accelerates the creation of applications that have a more honest and cooperative relationship with its users, but also allows the emergence of completely new business built on them.
To enable these new applications and the open web, it needs the appropriate infrastructure. The new web platform cannot be controlled by a single entity and its use is not limited due to insufficient scalability. It should be decentralized in design like the web itself and supported by a community of distributors widely so that the value they store cannot be monitored, modified or removed without permission from the users who store this value on their behalf.
Although the cost of storing data or performing a computation on the Ethereum blockchain is thousands to millions of times higher than the cost of the same functionality on Amazon Web Services, a developer can always create a "centralized" app, or even a centralized currency, for a fraction of the cost of doing the same on a decentralized platform, because a decentralized platform will, by definition, replicate its operations and storage many times over.
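To make that cost relationship concrete, here is a minimal back-of-the-envelope sketch in Python. The replication factor and base price are illustrative assumptions, not actual AWS or Ethereum prices; the point is only that a decentralized platform multiplies the underlying storage cost by the number of replicas that hold it.

```python
# Illustrative only: the numbers are assumptions, not real AWS or Ethereum prices.
centralized_cost_per_gb_month = 0.023   # assumed centralized cloud storage price, USD
replicas_in_decentralized_net = 5000    # assumed number of full replicas of the state

decentralized_cost_per_gb_month = centralized_cost_per_gb_month * replicas_in_decentralized_net
print(f"Centralized:   ${centralized_cost_per_gb_month:.3f} per GB-month")
print(f"Decentralized: ${decentralized_cost_per_gb_month:,.2f} per GB-month "
      f"({replicas_in_decentralized_net}x, since every full node stores the data)")
```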
Bitcoin can be thought of as the first, very basic, version of this global community-run cloud, though it is primarily used only to store and move the Bitcoin digital currency.
Ethereum is the second and slightly more sophisticated version, which expanded the basic principles of Bitcoin to create a more general computing and storage platform, though it is a raw technology, which hasn’t achieved meaningful mainstream adoption.

1.1 WHY IS IT IMPORTANT TO PAY THE EXTRA COST TO SUPPORT DECENTRALIZATION?

Because some elements of value — for example the bits representing digital currency ownership, personal identity, or asset notes — are very sensitive. In a centralized system, the following players can change the value of any balances they come into direct contact with:
  1. The developer who controls the release or update of the application’s code
  2. The platform where the data is stored
  3. The servers which run the application’s code
Even if none of these players intends to act in bad faith, the actions of governments, police forces and hackers can easily turn their hands against their users and censor, modify or steal the balances they are supposed to protect.
A typical user will trust a typical centralized application, despite its potential vulnerabilities, with everyday data and computation. Typically, only banks and governments are trusted sufficiently to maintain custody of the most sensitive information — balances of wealth and identity. But these entities are also subject to the very human forces of hubris, corruption and theft.
This is especially true after the 2008 global financial crisis, which demonstrated the fundamental problems of confidence in a highly indebted banking system, and given that governments around the world apply significant capital controls to citizens during times of crisis. On top of these examples, it has become a truism that hackers now hold most or all of your sensitive data.
These decentralized applications operate on a more complex infrastructure than today’s web but they have access to an instantaneous and global pool of currency, value and information that today’s web, where data is stored in the silos of individual corporations, cannot provide.

1.2 THE CHALLENGES OF CREATING A DECENTRALIZED CLOUD

A community-run system like this faces very different challenges from centralized "cloud" infrastructure, which is run by a single entity or a group of known entities. For example:
  1. It must be both inclusive to anyone and secure from manipulation or capture.
  2. Participants must be fairly compensated for their work while avoiding creating incentives for negligent or malicious behavior.
  3. It must be both game theoretically secure so good actors find the right equilibrium and resistant to manipulation so bad actors are actively prevented from negatively affecting the system.

2. NEAR

NEAR is a global community-run computing and storage cloud which is organized to be permissionless and which is economically incentivized to create a strong and decentralized data layer for the new web.
Essentially, it is a platform for running applications which have access to a shared — and secure — pool of money, identity and data which is owned by their users. More technically, it combines the features of partition-resistant networking, serverless compute and distributed storage into a new kind of platform.
NEAR is thus a community-run, decentralized cloud computing and storage platform, built on the same core blockchain technology that underpins Bitcoin and designed to enable the open web of the future: a web on which everything from new currencies to new applications to new industries can be created, opening the door to an entirely new future.
NEAR is a scalable computing and storage platform with the potential to change how systems are designed, how applications are built and how the web itself works.
It is a complex technology that nevertheless allows developers and entrepreneurs to easily and sustainably build applications which reap the benefits of decentralization and participate in the Open Web while minimizing the associated costs for end users.
NEAR creates the only community-managed cloud that is strong enough to power the future of the open web, as NEAR is designed from the ground up to deliver intuitive experiences to
end users, expand capacity across millions of devices, and provide developers with new and sustainable business models for their applications.
The NEAR Platform uses a token — also called “NEAR”. This token allows the users of these cloud resources, regardless of where they are in the world, to fairly compensate the providers of the services and to ensure that these participants operate in good faith.

2.1 WHY NEAR?

Through focus, we find that Platforms based on blockchain technologies like Bitcoin and Ethereum have made great progress and enriched the world with thousands of innovative applications spanning from games to decentralized financing.
However, neither these original networks nor any of the networks that followed have been able to bridge the gap toward mainstream adoption of the applications built on top of them, and none provides the kind of standard that fully supports the web.
This is a result of two key factors:
  1. System design
  2. Organization design
System design is relevant because the technical architecture of other platforms creates substantial problems with both usability and scalability which have made adoption nearly impossible by any but the most technical innovators. End-users experience 97–99% dropoff rates when using applications and developers find the process of creating and maintaining their applications endlessly frustrating.
Fixing these problems requires substantial and complex changes to current protocol architectures, something which existing organizations haven’t proven capable of implementing. Instead, they create multi-year backlogs of specification design and implementation, which result in their technology falling further and further behind.
NEAR’s platform and organization are architected specifically to solve the above-mentioned problems. The technical design is fanatically focused on creating the world’s most usable and scalable decentralized platform so global-scale applications can achieve real adoption. The organization and governance structure are designed to rapidly ship and continuously evolve the protocol so it will never become obsolete.

2.1.1 Features, which address these problems:

1. USABILITY FIRST
The most important problem to address is how to allow developers to create useful applications that users can use easily and that capture sustainable value for those developers.
2. End-User Usability
Developers will only build applications which their end users can actually use. NEAR's "progressive security" model allows developers to create experiences that more closely resemble familiar web experiences, by delaying onboarding, removing the need for users to learn "blockchain" concepts, and limiting the number of permission-asking interactions the user must go through to use the application.
1. Simple Onboarding: NEAR allows developers to take actions on behalf of their users, which lets them onboard users without requiring those users to provide a wallet or interact with tokens immediately upon reaching an application. Because accounts keep track of application-specific keys, user accounts can also be used for the kind of "Single Sign On" (SSO) functionality that users are familiar with from the traditional web (e.g. "Login with Facebook/Google/Github/etc."). A simplified sketch of this key model appears after this list.
2. Easy Subscriptions: Contract-based accounts allow for easy creation of subscriptions and custom permissioning for particular applications.
3. Familiar Usage Styles: The NEAR economic model allows developers to pay for usage on behalf of their users in order to hide the costs of infrastructure in a way that is in line with familiar web usage paradigms.
4. Predictable Pricing: NEAR prices transactions on the platform in simple terms, which allow end-users to experience predictable pricing and less cognitive load when using the platform.
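The following Python sketch is a conceptual model of the application-specific keys mentioned in item 1 above; it is not the actual NEAR account or SDK API, and the account names and allowance figures are invented for illustration. It only shows how an account that distinguishes full-access keys from per-application, permission-limited keys can support web-style sign-on without exposing the user's funds.

```python
from dataclasses import dataclass, field

@dataclass
class FunctionCallKey:
    """Hypothetical app-scoped key: may only call one contract, within a fee allowance."""
    app_contract: str
    allowance: float        # max fees this key may spend, in tokens

@dataclass
class Account:
    account_id: str
    full_access_keys: set = field(default_factory=set)
    app_keys: dict = field(default_factory=dict)   # key_id -> FunctionCallKey

    def add_app_key(self, key_id: str, app_contract: str, allowance: float):
        # An app can register a limited key for the user (web-style "Sign in with ..."),
        # without the user first acquiring tokens or approving every interaction.
        self.app_keys[key_id] = FunctionCallKey(app_contract, allowance)

    def can_call(self, key_id: str, contract: str, fee: float) -> bool:
        key = self.app_keys.get(key_id)
        if key is None or key.app_contract != contract or key.allowance < fee:
            return False
        key.allowance -= fee
        return True

# Usage: a game onboards alice with a key limited to its own contract.
alice = Account("alice.near")
alice.add_app_key("game-key-1", "cryptogame.near", allowance=0.25)
print(alice.can_call("game-key-1", "cryptogame.near", fee=0.01))  # True
print(alice.can_call("game-key-1", "bank.near", fee=0.01))        # False: wrong contract
```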

2.1.2 Design principles and development NEAR’s platform

1. Usability: Applications deployed to the platform should be seamless to use for end users and seamless to create for developers. Wherever possible, the underlying technology itself should fade to the background or be hidden completely from end users. Wherever possible, developers should use familiar languages and patterns during the development process. Basic applications should be intuitive and simple to create while applications that are more robust should still be secure.
2. Scalability: The platform should scale with no upper limit as long as there is economic justification for doing so in order to support enterprise-grade, globally used applications.
3. Sustainable Decentralization: The platform should encourage significant decentralization in both the short term and the long term in order to properly secure the value it hosts. The platform — and community — should be widely and permissionlessly inclusive and actively encourage decentralization and participation. To maintain sustainability, both technological and community governance mechanisms should allow for practical iteration while avoiding capture by any single parties in the end.
4. Simplicity: The design of each of the system’s components should be as simple as possible in order to achieve their primary purpose. Optimize for simplicity, pragmatism and ease of understanding above theoretical perfection.

2.2 HOW NEAR WORKS?

NEAR’s platform provides a community-operated cloud infrastructure for deploying and running decentralized applications. It combines the features of a decentralized database with those of a serverless compute platform. The token which allows this platform to run also enables applications built on top of it to interact with each other in new ways. Together, these features allow developers to create censorship-resistant back-ends for applications that deal with high-stakes data like money, identity, assets, and open-state components, which interact seamlessly with each other. These application back-ends and components are called “smart contracts,” though we will often refer to them all simply as “applications” here.
The infrastructure, which makes up this cloud, is created from a potentially infinite number of “nodes” run by individuals around the world who offer portions of their CPU and hard drive space — whether on their laptops or more professionally deployed servers. Developers write smart contracts and deploy them to this cloud as if they were deploying to a single server, which is a process that feels very similar to how applications are deployed to existing centralized clouds.
Once the developer has deployed an application, called a “smart contract”, and marked it unchangeable (“immutable”), the application will now run for as long as at least a handful of members of the NEAR community continue to exist. When end users interact with that deployed application, they will generally do so through a familiar web or mobile interface just like any one of a million apps today.
In the central cloud hosted by some companies today like: Amazon or Google, developers pay for their apps every month based on the amount of usage needed, for example based on the number of requests created by users visiting their webpages. The NEAR platform similarly requires that either users or developers provide compensation for their usage to the community operators of this infrastructure. Like today’s cloud infrastructure, NEAR prices usage based on easy to understand metrics that aren’t heavily influenced by factors like system congestion. Such factors make it very complicated for developers on alternative blockchain-based systems today.
In the centralized cloud, the controlling corporation makes decisions unilaterally. NEAR community-run cloud is decentralized so updates must ultimately be accepted by a sufficient quorum of the network participants. Updates about its future are generated from the community and subject to an inclusive governance process, which balances efficiency and security.
In order to ensure that the operators of nodes — who are anonymous and potentially even malicious — run the code with good behavior, they participate in a staking process called “Proof of Stake”. In this process, they willingly put a portion of value at risk as a sort of deposit, which they will forfeit if it is proven that they have operated improperly.
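As a rough illustration of the "deposit at risk" idea, here is a minimal Proof-of-Stake sketch in Python. It is a toy model under simplifying assumptions (one pooled stake per validator, instant slashing, an invented 50% penalty), not NEAR's actual staking or slashing rules.

```python
class Validator:
    def __init__(self, name: str, stake: float):
        self.name = name
        self.stake = stake        # tokens put at risk as a deposit

class StakingLedger:
    SLASH_FRACTION = 0.5          # assumed penalty: half the stake is forfeited

    def __init__(self):
        self.validators = {}

    def bond(self, name: str, stake: float):
        self.validators[name] = Validator(name, stake)

    def report_misbehavior(self, name: str, proof_valid: bool) -> float:
        # If misbehavior is proven, part of the deposit is forfeited ("slashed").
        v = self.validators[name]
        if proof_valid:
            penalty = v.stake * self.SLASH_FRACTION
            v.stake -= penalty
            return penalty
        return 0.0

ledger = StakingLedger()
ledger.bond("node-a", stake=10_000)
print(ledger.report_misbehavior("node-a", proof_valid=True))   # 5000.0 tokens forfeited
print(ledger.validators["node-a"].stake)                       # 5000.0 remaining at risk
```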

2.2.1 Elements of the NEAR’s Platform

The NEAR platform is made up of many separate elements. Some of these are native to the platform itself while others are used in conjunction with or on top of it.
1. THE NEAR TOKEN
The NEAR token is the fundamental native asset of the NEAR ecosystem and its functionality is enabled for all accounts. Each token is a unique digital asset similar to Ether, which can be used to:
a) Pay the system for processing transactions and storing data.
b) Run a validating node as part of the network by participating in the staking process.
c) Help determine how network resources are allocated and where its future technical direction will go by participating in governance processes.
The NEAR token enables the economic coordination of all participants who operate the network plus it enables new behaviors among the applications which are built on top of that network.
2. OTHER DIGITAL ASSETS
The platform is designed to easily store unique digital assets, which may include, but aren’t limited to:
  • Other Tokens: Tokens bridged from other chains (“wrapped”) or created atop the NEAR Platform can be easily stored and moved using the underlying platform. This allows many kinds of tokens to be used atop the platform to pay for goods and services. “Stablecoins,” specific kinds of token which are designed to match the price of another asset (like the US Dollar), are particularly useful for transacting on the network in this way.
  • Unique Digital Assets: Similar to tokens, digital assets (sometimes called “Non-Fungible Tokens”, or NFTs), ranging from in-game collectibles to representations of real-world asset ownership, can be stored and moved using the platform.
3. THE NEAR PLATFORM
The core platform, which is made up of the cloud of community-operated nodes, is the most basic piece of infrastructure provided. Developers can permissionlessly deploy smart contracts to this cloud and users can permissionlessly use the applications they power. Applications, which could range from consumer-facing games to digital currencies, can store their state (data) securely on the platform. This is conceptually similar to the Ethereum platform.
Operations that use accounts, the network, or storage on top of the platform require payment to the platform in the form of transaction fees, which the platform then distributes to its community of operators. These operations include creating new accounts, deploying new contracts, executing contract code, and storing or modifying data in a contract.
As long as the rules of the protocol are followed, any independent developer can write software, which interfaces with it (for example, by submitting transactions, creating accounts or even running a new node client) without asking for anyone’s permission first.
4. THE NEAR DEVELOPMENT SUITE
Set of tools and reference implementations created to facilitate its use by those developers and end users who prefer them. These tools include:
  • NEAR SDKs: The NEAR platform supports Rust and AssemblyScript for writing smart contracts. To provide a great experience for developers, NEAR has a full SDK, which includes standard data structures, examples and testing tools for these two languages.
  • Gitpod for NEAR: NEAR uses existing technology Gitpod to create zero time onboarding experience for developers. Gitpod provides an online “Integrated Development Environment” (IDE), which NEAR customized to allow developers to easily write, test and deploy smart contracts from a web browser.
  • NEAR Wallet: A wallet is a basic place for developers and end users to store the assets they need to use the network. NEAR Wallet is a reference implementation that is intended to work seamlessly with the progressive security model that lets application developers design more effective user experiences. It will eventually include built-in functionality to easily enable participation by holders in staking and governance processes on the network.
  • NEAR Explorer: To aid with both debugging of contracts and the understanding of network performance, Explorer presents information from the blockchain in an easily digestible web-based format.
  • NEAR Command Line Tools: The NEAR team provides a set of straightforward command line tools to allow developers to easily create, test and deploy applications from their local environments.
All of these tools are being created in an open-source manner so they can be modified or deployed by anyone.

3. ECONOMIC

The ecosystem that makes up the NEAR platform is driven primarily by economic forces. This economy creates the incentives that allow participants to permissionlessly organize to drive the platform’s key functions, while creating strong disincentives for undesirable, irresponsible or malicious behavior. For the platform to be effective, these incentives need to exist both in the short term and in the long term.
The NEAR platform is a market between participants on two sides:
  • On the supply side, validating node operators and other core infrastructure providers must be motivated to provide the services that make up the community cloud.
  • On the demand side, the platform’s developers and end-users who pay for their usage need to be able to do so in a simple, clear and consistent way that works for them.
Further, economic forces can also be applied to support the ecosystem as a whole. They can be used at a micro level to create new business models by directly compensating the developers who create its most useful applications. They can also be used at a macro level by coordinating the efforts of a broader set of ecosystem participants who participate in everything from education to governance.

3.1 NEAR ECONOMY DESIGN PRINCIPLES

NEAR’s overall system design principles are used to inform its economic design according to the following interpretations:
1. Usability: End users and developers should have predictable and consistent pricing for their usage of the network. Users should never lose data forever.
2. Scalability: The platform should scale at economically justified thresholds.
3. Simplicity: The design of each of the system’s components should be as simple as possible in order to achieve their primary purpose.
4. Sustainable Decentralization: The barrier for participation in the platform as a validating node should be set as low as possible in order to bring a wide range of participants. Over time, their participation should not drive wealth and control into the hands of a small number. Individual transactions made far in the future must be at least as secure as those made today in order to safeguard the value they modify.

3.2 ECONOMIC OVERVIEW

The NEAR economy is optimized to provide developers and end users with the easiest possible experience while still providing proper incentives for network security and ecosystem development.
Summary of the key ideas that drive the system:
  • Thresholded Proof of Stake: Validating node operators provide scarce and valuable compute resources to the network. In order to ensure that the computations they run are correct, they are required to “stake” NEAR tokens, which guarantee their results. If these results are found to be inaccurate, the staker loses their tokens. This is a fundamental mechanism for securing the network. The threshold for participating in the system is set algorithmically at the lowest level possible to allow for the broadest possible participation of validating nodes in a given “epoch” period (½ of a day).
  • Epoch Rewards: Node operators are paid for their service a fixed percentage of total supply as a “security” fee of roughly 4.5% annualized. This rate targets sufficient participation levels among stakers in order to secure the network while balancing with other usage of NEAR token in the ecosystem.
  • Protocol treasury: In addition to validators, the protocol treasury receives 0.5% of total supply annually to continuously re-invest into ecosystem development.
  • Transaction Costs: Usage of the network consumes two separate kinds of resources — instantaneous and long term. Instantaneous costs are generated by every transaction because each transaction requires the usage of both the network itself and some of its computation resources. These are priced together as a mostly-predictable cost per transaction, which is paid in NEAR tokens.
  • Storage Costs: Storage is a long term cost because storing data represents an ongoing burden to the nodes of the network. Storage costs are covered by maintaining a minimum balance of NEAR tokens on the account or contract. This provides an indirect mechanism of payment, via inflation, to validators for maintaining contract and account state on their nodes.
  • Inflation: Inflation is determined as the combination of payouts to validators and the protocol treasury, minus collected transaction fees and a few other NEAR burning mechanics (like the name auction). Overall the maximum inflation is 5%, which can go down over time as the network gets more usage and more transaction fees are burned. It is even possible for inflation to become negative (total supply decreases) if enough fees are burned; a simplified calculation is sketched after this list.
  • Scaling Thresholds: In a network that scales its capacity relative to the amount of usage it receives, the thresholds that drive the network to bring on additional capacity are economic in nature.
  • Security Thresholds: Some thresholds, which provide for good behavior among participants are set using economic incentives. For example, “Fishermen” (described separately).
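As mentioned in the Inflation item above, here is a simplified sketch of how net inflation falls (or even goes negative) as fee burn grows. Only the 4.5% validator reward and 0.5% treasury rates come from the text; the total supply and yearly burn figures are assumptions chosen purely for illustration.

```python
def net_inflation(total_supply: float, fees_burned_per_year: float) -> float:
    """Toy model: token payouts minus burned fees, using the rates described above."""
    validator_rewards = 0.045 * total_supply   # ~4.5% annualized "security" fee to validators
    treasury = 0.005 * total_supply            # 0.5% annually to the protocol treasury
    net_new_tokens = validator_rewards + treasury - fees_burned_per_year
    return net_new_tokens / total_supply       # expressed as a fraction of supply

supply = 1_000_000_000                          # assumed total supply
for burned in (0, 20_000_000, 60_000_000):      # assumed yearly fee-burn scenarios
    print(f"burn={burned:>11,} -> net inflation {net_inflation(supply, burned):+.2%}")
# With enough fees burned (last case), net inflation is negative: total supply shrinks.
```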
Full Report
submitted by CoinEx_Institution to Coinex [link] [comments]

small passive income while browsing the web

Introduction
If you were on the internet in the late 1990s, you might remember companies like "AllAdvantage" that promised to pay you to browse the web. You could install a program that tracked your browsing and showed you targeted ads at the top of the screen, then "AllAdvantage" would give you a cut of the ad revenue you generated.
These schemes largely disappeared after the dot-com crash. But Brendan Eich, the creator of the JavaScript programming language and cofounder and former CTO of Mozilla, thinks his company Brave Software has found a way to revive that old idea.
What is it?
Brave makes a browser based on Chromium (the open-source project behind Google Chrome) that blocks tracking scripts and other technologies that spy on your online activity. As a result, it also blocks many web ads; if you visit any website using the Brave browser, you won’t see any ads. But Brave gives users the option to see ads that Eich says will respect your privacy. The ads appear as desktop notifications, he says, not as replacements for the ads the Brave browser blocks. So you still won’t see ads on any website, but you might see them in the lower right corner of your screen. If you choose to see these ads, you’ll get 70 percent of the revenue they generate.
Eich hopes Brave can solve two of the web's most vexing problems, privacy and revenue, by turning the traditional digital advertising model on its head. Today, ad networks pay sites for ad space and web browsers like Brave and Chrome deliver content from those publishers to users. Brave is trying to put the browser in the center of the advertising experience. Instead of paying publishers directly, ad networks would pay Brave, which would pass part of the money to users and keep a cut for itself.
By handling advertising in the browser on your device, Brave says it will be able to target ads without sending your data to the cloud, and protect your privacy. When you interact with an ad on Brave, the browser sends notice to the company's servers, but doesn't include any identifying information. Eich sees four sets of winners: browser makers get paid; users get paid, and get more privacy; advertisers can target pitches without running afoul of European privacy regulations; and publishers can survive in a world where many users are installing ad blockers.
Publishers and ad networks might bristle at the idea of putting browser makers in the middle of their business. But in recent years browsers have taken a more active role in shaping the web, instead of merely displaying a website’s content. Chrome now blocks ads on a small number of sites with particularly egregious advertising practices, while browsers like Firefox and Safari have added privacy protections. Meanwhile, browser plugins are giving users more control over their experience. There are Chrome extensions, for example, that let you change Facebook's color scheme, or change the way images are displayed on Pinterest. And of course there are extensions that block all ads.
Trying to win advertisers and publishers to a new model isn't Brave's only challenge. It also needs users. Eich says Brave has 15 million users and is growing.
Brave will give users a 70 percent cut of its advertising revenue, which Eich estimates could work out to about $10 a month. Brave will pay users with its own bitcoin-style "cryptocurrency” called Basic Attention Tokens or BAT, which has traded for as little as 24 cents over the past 12 months, according to CoinMarketCap. You can exchange the BAT you have received for viewing ads into USD, EUR, GBP, CHF and many more currencies.
The company offers a service through the cryptocurrency exchange Uphold to allow users to exchange, sell and buy BAT or donate it to publishers, and for publishers to exchange the BAT they receive for dollars. Advertisers — Home Depot, for example; recent campaigns have included brands such as Verizon, Newegg, Chipotle, and PayPal/Honey, in addition to earlier campaigns by Amazon, Harry’s Razors, Intel, CBS, KIND snacks, Logitech, Lenovo, Grubhub, Belkin, Quickbooks, Evernote and some cryptocurrency-related companies — will be able to buy ads either with BAT or with traditional currencies.
Eich says Brave opted to create its own tokens using the Ethereum cryptocurrency platform in part to avoid regulatory requirements, such as verifying users' identities, that partners like Uphold are better equipped to handle.
Estimated revenue (depending on the country you live in, the revenue can be higher or lower):
I made around $300 so far this year using 3 devices, just for viewing some ads.
That is 5 months so far (July is not included). If you break it down to 1 device, $100 / 5 months = $20 a month just for viewing ads; you would need to buy risky stocks worth around $2,000 to get the same amount per month.
I can only recommend that everyone try it. Not every country has the same number of advertisers, so you will probably get the most out of it if you live in the USA.
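A quick sanity check of the arithmetic above in Python, using the figures from this post ($300 over 5 months across 3 devices). The 12% annual return used for the "risky stocks" comparison is my own assumption, not the author's.

```python
total_earned = 300.0      # USD reported over the period
months = 5
devices = 3

per_device_per_month = total_earned / devices / months
print(f"~${per_device_per_month:.0f} per device per month")   # ~$20

# How much capital in "risky stocks" would yield the same monthly income?
assumed_annual_return = 0.12                                   # assumption: 12% per year
capital_needed = per_device_per_month * 12 / assumed_annual_return
print(f"~${capital_needed:,.0f} invested at {assumed_annual_return:.0%}/yr for the same ~$20/month")
```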
If you are interested here is a quick guide how to set it up to get the max amount out of Brave:
Quickstartguide:
1. Download Brave here
2. Activate the reward system (gif link below): Gif link
3. Go into the settings, deactivate auto-contribution, and activate 5 ads per hour (image link below): image link
4. Create an account on Uphold and connect it with your Brave browser.
Now you are good to go and can make some money on something you do anyway.
I hope this helps some folks in the community to make some extra bucks.
edit1: you can find more info and support here: brave_browser & BATProject or www.brave.com
edit2: the earnings depend on the number of devices you are using and on where you live. Best paying countries: United States (69), United Kingdom (39), Canada (36), Australia (35), New Zealand (26), Germany (21), Ireland (21), France (18) (the number next to each country is the number of companies running ads on Brave for that particular country; the more companies, the more revenue).
you can find a full list with all countries and campaigns here: https://brave.com/transparency/
edit3: You don't need to browse to a certain website to receive ads, just browse as you are used to, play browser games, watch videos on YouTube or do whatever you want. Sometimes ads appear on the start page (it looks like this: https://i.imgur.com/5tohhRc.jpg), and after some time a clickable pop-up appears in the lower right corner (it looks like this: https://i.imgur.com/CTGdVsu.png).
edit4: If you want to import your bookmarks and settings from your old browser: in the top right corner of the browser there is a button -> https://i.imgur.com/oi8EAri.jpg. Click it, then Settings, and then you get the option to import bookmarks and settings from your old browser.
If you want to sync Brave between devices and for backups: type brave://flags/ into the address bar, then type brave sync into the search bar and activate it. If it is enabled it should look like this: https://imgur.com/a/tCMDgDj. Then just click on sync -> https://i.imgur.com/oi8EAri.jpg
here is a guide ->https://support.brave.com/hc/en-us/articles/360021218111-How-do-I-set-up-Sync
edit5: Don't keep the BAT from free token grants in your browser for too long; always send your BAT to an external wallet or exchange like Uphold. Only tokens from free token grants have an expiry date; if they don't get used they go back to the BAT pool. You can find more info about this here -> https://support.brave.com/hc/en-us/articles/360018305731-Why-does-my-BAT-have-an-expiration-date-
submitted by OnlyReveal6 to beermoneyglobal [link] [comments]

Don't blindly follow a narrative, its bad for you and its bad for crypto in general

I mostly lurk around here, but I see a pattern repeating over and over again here and in multiple communities, so I have to post. I'm posting this here because I appreciate that this sub is a place of free speech and maybe something productive can come out of this post, while the bitcoin sub is just fucking censorship, memes and moon/lambo posts. If you don't agree, write in the comments why, instead of downvoting. You don't have to upvote either, but when you downvote you are killing the opportunity to have a discussion. If you downvote or comment that I'm wrong without providing any counterpoints, you are no better than the BTC maxis you despise.
In various communities I see a narrative being used to bring people in and making them follow something without thinking for themselves. In crypto I see this mostly in BTC vs BCH tribalistic arguments:
- BTC community: "Everything that is not BTC is shitcoin." or more recently as stated by adam on twitter, "Everything that is not BTC is a ponzi scheme, even ETH.", "what is ETH supply?", and even that they are doing this for "altruistic" reasons, to "protect" the newcomers. Very convenient for them that they are protecting the newcomers by having them buy their bags
- BCH community: "BTC maxis are dumb", "just increase block size and you will have truly p2p electronic cash", "It is just that simple, there are no trade offs", "if you don't agree with me you are a BTC maxi", "BCH is satoshi's vision for p2p electronic cash"
It is not exclusive to crypto but also politics, and you see this over and over again on twitter and on reddit.
My point is that narratives are created so people don't have to think; they just choose a narrative that is easy to follow and makes sense to them, and stick with it. And people keep repeating these narratives to bring other people in, maybe out of ignorance, because they truly believe it without questioning, or maybe out of self-interest, because they want to shill you their bags.
Because this is the BCH community, and because the bitcoin sub is censored so I can't post there about the problems in the BTC narrative (some of which are IMO correctly identified by the BCH community), I will stick with the narrative I see in the BCH community.
What prompted this post was, firstly, this post by user u/scotty321: "The BTC Paradox: “A 1 MB blocksize enables poor people to run their own node!” “Okay, then what?” “Poor people won’t be able to use the network!”". You will see many posts of this kind being made by u/Egon_1 as well. Then there is also this comment in that thread by u/fuck_____________1, saying that people who want to run their own nodes are retarded and that there is no reason to want to do that: "Just trust block explorer websites". And the post and comment were highly upvoted. Really? Do you really think there is no problem in having just a few nodes on the network? And that the only thing securing the network is the miners?
As stated by user u/co1nsurf3r in that thread:
While I don't think that everybody needs to run a node, a full node does publish blocks it considers valid to other nodes. This does not amount to much if you only consider a single node in the network, but many "honest" full nodes in the network will reduce the probability of a valid block being withheld from the network by a collusion of "hostile" node operators.
But surely this will not get attention here, and will be downvoted by those people that promote the narrative that there is no trade off in increasing the blocksize and the people that don't see it are retarded or are btc maxis.
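The quoted point about many honest full nodes can be made concrete with a small probability sketch. Assuming, purely for illustration, that a client connects to k peers drawn independently from a node population of which a fraction p is hostile, the chance that every peer colludes to withhold a valid block is p^k, which falls off quickly as honest nodes are added.

```python
def prob_all_peers_hostile(hostile_fraction: float, peers: int) -> float:
    """Chance that every peer a client connects to is hostile (independence assumed)."""
    return hostile_fraction ** peers

for p in (0.5, 0.2):
    for k in (1, 4, 8):
        print(f"hostile fraction {p:.0%}, {k} peers -> {prob_all_peers_hostile(p, k):.6f}")
# Even with half the nodes hostile, 8 independently chosen peers leave only a ~0.4% chance
# that a client never hears about a valid block.
```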
The only narrative I stick to and have been for many years now is that cryptocurrency takes power from the government and gives power to the individual, so you are not restricted to your economy as you can participate in the global economy. There is also the narrative of banking the bankless, which I hope will come true, but it is not a use case we are seeing right now.
Some people would argue that removing power from gov's is a bad thing, but you can't deny the fact that gov's can't control crypto (at least we would want them not to).
But if you really want individuals to remain in control of their money and to transact with anyone in the world, the network needs to be very resistant to any kind of attack. How can you have p2p electronic cash if your network has just a handful of nodes and the Chinese government can locate them and simply block communication to them? I'm not saying that this is the case for BCH; I'm just refuting the claim that there is no value in running your own node. If you are relying on block explorers, the government can just block communication to the block explorer websites. Then what? Who will you trust to get chain information? The nodes need to be decentralized so that if you take one node down, many more can appear; that way the network is hard to censor and you don't have just a few points of failure.
Right now BTC is focusing on the use case of being difficult to censor. But with that comes the problem that it is very expensive to transact on the network, which defeats the purpose of anyone being able to participate. Obviously I do think that is also a major problem, and the lightning network is awful right now and probably still years away from being usable, if it ever will be. The best solution is up for debate, but thinking that you just have to increase the blocksize and there is no trade-off is naive or misleading. BCH is doing a good thing in trying to come up with a solution that is inclusive and promotes cheap and fast transactions, but don't forget that centralization is a major concern and nothing to just shrug off.
Saying that "a 1 MB blocksize enables poor people to run their own" and that because of that "Poor people won’t be able to use the network" is a misrepresentation designed to promote a narrative. Because 1MB is not to allow "poor" people to run their node, it is to facilitate as many people to run a node to promote decentralization and avoid censorship.
Also an elephant in the room that you will not see being discussed in either BTC or BCH communities is that mining pools are heavily centralized. And I'm not talking about miners being mostly in china, but also that big pools control a lot of hashing power both in BTC and BCH, and that is terrible for the purpose of crypto.
Other projects are trying to solve that. Will they be successful? I don't know, I hope so, because I don't buy into any narrative. There are many challenges and I want to see crypto succeed as a whole. As always guys, DYOR and always question if you are not blindly following a narrative. I'm sure I will be called BTC maxi but maybe some people will find value in this. Don't trust guys that are always posting silly "gocha's" against the other "tribe".
EDIT: User u/ShadowOfHarbringer has pointed me to some threads that this has been discussed in the past and I will just put my take on them here for visibility, as I will be using this thread as a reference in future discussions I engage:
When there was only 2 nodes in the network, adding a third node increased redundancy and resiliency of the network as a whole in a significant way. When there is thousands of nodes in the network, adding yet another node only marginally increase the redundancy and resiliency of the network. So the question then becomes a matter of personal judgement of how much that added redundancy and resiliency is worth. For the absolutist, it is absolutely worth it and everyone on this planet should do their part.
What is the magical number of nodes that makes it counterproductive to add new nodes? Did he do any math? Does BCH achieve this holy grail of a safe number of nodes? Guess what: nobody knows at what number of nodes it starts to be marginally irrelevant to add new nodes. Even BTC today may still not have enough nodes to be safe. If you can't know for sure that you are safe, it is better to err on the side of caution. Thousands of nodes may still not be enough and, as I said, it is much cheaper to run a full node than it is to mine. If it costs millions in hash power to do a 51% attack on block generation, that means little if it costs less than $10k to run more nodes than there are in total in the network, cause havoc, and slow people down from using the network, or to use bot farms to DDoS the thousands of nodes in the network. Not all attacks are monetarily motivated. When governments with billions of dollars at their disposal face something that could threaten their power, they will do anything they can to stop people from using it, and the cheaper it is to do so, the better.
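To illustrate the cost asymmetry claimed above, here is a rough sketch that reuses the post's own figures (millions of dollars for a hash-rate attack, under $10k to outnumber existing nodes) together with an assumed node count and an assumed cheap VPS price; every number here is illustrative, not a measurement.

```python
hashpower_attack_cost = 5_000_000      # assumed: "millions in hash power" for a 51% attack, USD
existing_full_nodes = 1_500            # assumed current full-node count
vps_cost_per_month = 5                 # assumed cheap VPS price, USD/month

sybil_nodes_needed = existing_full_nodes + 1
sybil_monthly_cost = sybil_nodes_needed * vps_cost_per_month
print(f"Outnumbering all {existing_full_nodes} nodes: ~${sybil_monthly_cost:,}/month")
print(f"That is ~{hashpower_attack_cost / sybil_monthly_cost:,.0f}x cheaper than the assumed 51% attack")
```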
You should run a full node if you're a big business with e.g. >$100k/month in volume, or if you run a service that requires high fraud resistance and validation certainty for payments sent your way (e.g. an exchange). For most other users of Bitcoin, there's no good reason to run a full node unless you reel like it.
Shouldn't individuals benefit from fraud resistance too? Why just businesses?
Personally, I think it's a good idea to make sure that people can easily run a full node because they feel like it, and that it's desirable to keep full node resource requirements reasonable for an enthusiast/hobbyist whenever possible. This might seem to be at odds with the concept of making a worldwide digital cash system in which all transactions are validated by everybody, but after having done the math and some of the code myself, I believe that we should be able to have our cake and eat it too.
This is recurrent argument, but also no math provided, "just trust me I did the math"
The biggest reason individuals may want to run their own node is to increase their privacy. SPV wallets rely on others (nodes or ElectronX servers) who may learn their addresses.
It is a valid reason, but not the biggest one
If you do it for fun and experimental it good. If you do it for extra privacy it's ok. If you do it to help the network don't. You are just slowing down miners and exchanges.
Yes, it will slow down the network, but that shows how people just don't get the trade-off they are making
I will just copy/paste what Satoshi Nakamoto said in his own words. "The current system where every user is a network node is not the intended configuration for large scale. That would be like every Usenet user runs their own NNTP server."
Another "it is all or nothing argument" and quoting satoshi to try and prove their point. Just because every user doesn't need to be also a full node doesn't mean that there aren't serious risks for having few nodes
For this to have any importance in practice, all of the miners, all of the exchanges, all of the explorers and all of the economic nodes should go rogue all at once. Collude to change consensus. If you have a node you can detect this. It doesn't do much, because such a scenario is impossible in practice.
Not true, because as I said, you can DDoS the current nodes or run more malicious nodes than there currently are, because it is cheap to do so
Non-mining nodes don't contribute to adding data to the blockchain ledger, but they do play a part in propagating transactions that aren't yet in blocks (the mempool). Bitcoin client implementations can have different validations for transactions they see outside of blocks and transactions they see inside of blocks; this allows for "soft forks" to add new types of transactions without completely breaking older clients (while a transaction is in the mempool, a node receiving a transaction that's a new/unknown type could drop it as not a valid transaction (not propagate it to its peers), but if that same transaction ends up in a block and that node receives the block, they accept the block (and the transaction in it) as valid (and therefore don't get left behind on the blockchain and become a fork). The participation in the mempool is a sort of "herd immunity" protection for the network, and it was a key talking point for the "User Activated Soft Fork" (UASF) around the time the Segregated Witness feature was trying to be added in. If a certain percentage of nodes updated their software to not propagate certain types of transactions (or not communicate with certain types of nodes), then they can control what gets into a block (someone wanting to get that sort of transaction into a block would need to communicate directly to a mining node, or communicate only through nodes that weren't blocking that sort of transaction) if a certain threshold of nodes adheres to those same validation rules. It's less specific than the influence on the blockchain data that mining nodes have, but it's definitely not nothing.
The first reasonable comment in that thread, but it is buried deep down with only 1 upvote
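The quoted comment's distinction between mempool (relay) policy and block (consensus) validity can be sketched as follows. The transaction "types" and rules here are invented purely for illustration and do not correspond to any real Bitcoin or BCH rule; the sketch only shows why a node with stricter relay policy than block validation doesn't fork off when a new transaction type appears in a block.

```python
CONSENSUS_VALID_TYPES = {"standard", "new_feature"}   # what is valid *inside a block*
RELAY_POLICY_TYPES    = {"standard"}                  # stricter: what this node will relay

class Node:
    def __init__(self):
        self.mempool = []
        self.chain = []

    def accept_to_mempool(self, tx_type: str) -> bool:
        # Policy check: unknown/new types are simply not relayed (dropped, not treated as invalid).
        if tx_type in RELAY_POLICY_TYPES:
            self.mempool.append(tx_type)
            return True
        return False

    def accept_block(self, block_txs: list) -> bool:
        # Consensus check: a block containing the new type is still accepted,
        # so older-policy nodes don't fall off the chain (soft-fork compatibility).
        if all(tx in CONSENSUS_VALID_TYPES for tx in block_txs):
            self.chain.append(block_txs)
            return True
        return False

n = Node()
print(n.accept_to_mempool("new_feature"))            # False: not relayed by this node's policy
print(n.accept_block(["standard", "new_feature"]))   # True: still valid when mined into a block
```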
The addition of non-mining nodes does not add to the efficiency of the network, but actually takes away from it because of the latency issue.
That is true and is actually a trade off you are making, sacrificing security to have scalability
The addition of non-mining nodes has little to no effect on security, since you only need to destroy mining ones to take down the network
It is true that if you destroy the mining nodes you stop the network from producing new blocks (temporarily), even if you have a lot of non-mining nodes. But it is still better than if you take down mining nodes that are also the only full nodes. If the miners are not the only full nodes, at least you still have full nodes with the blockchain data, so new miners can download it and join. If all the miners are also the only full nodes and you take them down, where will you get all the past blockchain data to start mining again? Just pray that the miners that were taken down come back online at some point in the future?
The real limiting factor is ISP's: Imagine a situation where one service provider defrauds 4000 different nodes. Did the excessive amount of nodes help at all, when they have all been defrauded by the same service provider? If there are only 30 ISP's in the world, how many nodes do we REALLY need?
You can't defraud nodes if the connection is encrypted. Use Tor, for example; it is hard for ISPs to know what you are doing.
Satoshi specifically said in the white paper that after a certain point, number of nodes needed plateaus, meaning after a certain point, adding more nodes is actually counterintuitive, which we also demonstrated. (the latency issue). So, we have adequately demonstrated why running non-mining nodes does not add additional value or security to the network.
Again, what is the number of nodes that makes it counterproductive? Did he do any math?
There's also the matter of economically significant nodes and the role they play in consensus. Sure, nobody cares about your average joe's "full node" where he is "keeping his own ledger to keep the miners honest", as it has no significance to the economy and the miners couldn't give a damn about it. However, if say some major exchanges got together to protest a miner activated fork, they would have some protest power against that fork because many people use their service. Of course, there still needs to be miners running on said "protest fork" to keep the chain running, but miners do follow the money and if they got caught mining a fork that none of the major exchanges were trading, they could be coaxed over to said "protest fork".
In consensus, what matters about nodes is only their number; the economic power of a node means nothing, because the protocol doesn't see the net worth of the individual or organization running that node.
Running a full node that is not mining and not involved is spending or receiving payments is of very little use. It helps to make sure network traffic is broadcast, and is another copy of the blockchain, but that is all (and is probably not needed in a healthy coin with many other nodes)
He gets it right (broadcasting transactions and keeping a copy of the blockchain) but he dismisses the importance of it
submitted by r0bo7 to btc [link] [comments]

Update and Few Thoughts, a (Well-Typed) transcript: Liza&Charles the marketeers, Voltaire kick-off, PrisM and Ebb-and-Flow to fuck ETH2.0 Gasper, the (back)log of a man and a falcon, lots of companies, September Goguen time, Basho, 2021 Titans, Basho, Hydra and much more thoughts and prayers

Hi everybody this is Charles Hoskinson broadcasting live from warm sunny Colorado. I'm trying a new streaming service and it allows me to annotate a few things and simulcast to both periscope and youtube. Let's see how this works. I also get to put a little caption. I think for the future, I'm just for a while going to put: "I will never give away ada". So, when people repost my videos for giveaway scams they at least have that. First off, a thank you, a community member named Daryl had decided to carve a log and give his artistic impression of my twitter profile picture of me and the falcon so that always means a lot when I get these gifts from fans and also I just wanted to, on the back of the Catalyst presentation, express my profound gratitude and excitement to the community.
You know, it's really, really cool to see how much progress has been made in such a short period of time. It was only yesterday that we were saying "when Shelley?" Now Shelley's out and it's evolving rapidly. Voltaire is now starting to evolve rapidly and we're real close to Goguen. At the end of this month we'll be able to talk about some of the realities of Goguen and some of the ideas we have, give dates for certain things, and give you a sense of where that project is at. The good news is that we have gained an enormous amount of progress and knowledge about what we need to do and how to get that done, and basically people are just executing, and it's a much smaller task than getting us to Shelley. With Byron to Shelley we literally had to build a completely new cryptocurrency from the ground up. We had to have new ledger rules and a new update system, we had to invent a way of transitioning from one system to another system, and there were hundreds of other little innovations along the way: a new network stack and so forth. Byron cosmetically looks like Shelley, but under the hood it's completely different, and the Shelley design was built with a lot of the things that we needed for Goguen in mind. For example, we built Shelley with the idea of extended UTXO, and we built Shelley understanding what the realities were for the smart contract model, and that's one of the advantages you get when you do this type of bespoke engineering. There are two consequences to that: one, the integration is significantly easier, and two, the integration is significantly faster. We won't face that same complexity this time.
At the product update at the end of the month we'll really start discussing some of these things, as well as talk about partners and how the development ecosystem is going to evolve. There are a lot of threads throughout all three organizations happening simultaneously. Emurgo are thinking deeply about DeFi, and they've invited us to collaborate with them on things like stablecoins, for example, but we're also looking at oracles (oracle pools), DEXes and these other things, and because there are already people in market who have made mistakes and learned lessons, it gives us the benefit of hindsight. It means we can be much faster to market and we can build much more competitive things in market, and the Cardano community gets first access to these next-generation DeFi applications without a lot of the problems of the prior generations, and that's super beneficial to us.
You know, the other side of it is that Voltaire is going to have a systemic influence not just on community funding but also on the overall evolution and direction of the platform. The longer it exists the more pervasive it will become. It will probably first be applied towards the Cardano Foundation roadmap, but later on it will definitely have a lot of influence and say over every aspect of the system, including the launch dApps and these other things; basically, long term, the types of problems that Cardano solves. That's incredibly appealing to me and very exciting, because it's like I have this giant community brain with the best and brightest of all of you working with us to get us where we need to go.
You know, another thing that was super encouraging, and it's a small thing, but it shows us that we're definitely heading in the right direction, was that we recently got a demo from Pramod (Viswanath) and his team out of the University of Illinois on a protocol they created called PrisM, which is a super fast proof-of-work protocol. They wrote this beautiful paper, and they wrote code along with it, showing that PrisM is ten thousand times faster than Nakamoto consensus. If you take the bitcoin proof-of-work protocol, strip it out and put PrisM in, you can run the entire bitcoin system 10000 times faster, and they have beautiful benchmarks to show that, even in bad network conditions. I'm promoting this team; they're real researchers and real engineers, and they use a lot of cool HPC concepts, like springboarding and other things, to accommodate that. Then I asked him in the presentation, well, how much faster is it if you replay the Ethereum chain? He says, well, it takes a big performance hit, maybe only a hundred times faster, because that model is not as easy to optimize and shard with standard computer science concepts. In fact, in some cases there are limitations there that really can't be overcome. It turns out that we're more on that UTXO side than on the account side. Whether by coincidence or by intent of the design of extended UTXO, we're going to have a much easier time getting much higher performance where and when it's necessary.
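A minimal sketch of the property being pointed at here (my own illustration, not PrisM or IOG code; the transaction and UTXO shapes are invented): UTXO-style transactions that spend disjoint inputs can be checked independently, and therefore in parallel, whereas account-model transactions that touch the same account must be ordered.

```python
# Toy illustration of why UTXO-style ledgers parallelize more naturally.
from concurrent.futures import ThreadPoolExecutor

utxo_set = {("tx0", 0): 5, ("tx0", 1): 3, ("tx1", 0): 7}   # (tx_id, index) -> value

txs = [
    {"inputs": [("tx0", 0)], "outputs": [4]},   # spends a different UTXO ...
    {"inputs": [("tx1", 0)], "outputs": [6]},   # ... than this one: disjoint, so parallel-safe
]

def validate(tx):
    """A tx is valid if its inputs exist and it does not create value out of thin air."""
    in_value = sum(utxo_set[i] for i in tx["inputs"] if i in utxo_set)
    return all(i in utxo_set for i in tx["inputs"]) and sum(tx["outputs"]) <= in_value

# Because the two txs touch disjoint UTXOs, validation can fan out across threads.
with ThreadPoolExecutor() as pool:
    print(list(pool.map(validate, txs)))        # [True, True]
```

With a shared mutable account balance, the same two transactions would have to be applied in sequence, which is the intuition behind the "harder to shard" remark.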
I also approved this week a scaling up of the Basho project, in particular to build a Hydra prototype team. The science has gotten to a point where we can make a really competitive push in that particular direction. What does that mean? It means that in just a few short months we can de-risk technological approaches that long term will give us a lot of fruit, where and when the community decides that they need infrastructure like Hydra. Now, here's the beautiful thing about Hydra. If you watch my whiteboard back in September of 2017, when Cardano first hit market with Byron, I talked about this concept of looking at scalability with a very simple test, which is: as you get more people in the system, does it stay at the same performance or get faster? We all experience systems that do this, for example BitTorrent; with more people downloading something you tend to be able to get it faster. And we all experience the converse, systems that get slower when you get more people. What does this mean? It means that Hydra is an actual approach towards true scalability in the system, and it's a lot easier to do than sharding, even though we have a beautiful approach to get sharding on the ledger side if we truly desire to go down that way. There are beautiful ideas that we are definitely in deep discussions about, but that's a very complex thing. There was recently a paper ("Ebb-and-Flow Protocols: A Resolution of the Availability-Finality Dilemma") out of Stanford that showed that the Gasper protocol, as proposed for ETH2.0, does have some security concerns, and it's going to be the burden on the shoulders of the Ethereum 2.0 developers and Vitalik to address those concerns from those Stanford professors. Whenever you have these very complex protocols they have so many different ways they can break and things can go wrong, so it's much more appealing when you don't have to embrace complexity to achieve the same result. The elegance of Hydra is that stake pool operators are very natural parties to put Hydra channels on, and every time we add one we get much more performance out of that as the system gets more valuable. The k factor increases, which means you get more stake pool operators, which means you get more Hydra channels; so with growth we get appreciation, with appreciation we get more decentralization, with more decentralization we get more performance. In essence, spiritually speaking, this is really what we meant when we said scalability: that the system will always grow to meet its particular needs, and we have a very elegant way of moving in that direction that doesn't require us to embrace very sophisticated techniques. That's not to say these techniques don't have a place and purpose, but it means the urgency of implementing them is gone, and we then have the luxury to pick the best science when it's ready instead of rushing it to market to resolve a crisis of high fees. We'll never have that crisis, so there's a beauty to Cardano that is missing, in my view, from many cryptocurrencies and blockchains in the marketplace, and we're now seeing that beauty shine through. Not only through our community, who are so passionate and amazing, but in the science and the engineering itself and how easy it is for us to navigate the concepts, how easy it is for us to add more things, to take some things away, to clean some things up here and there, and our ability to move through.
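As a crude way to visualize the "more operators, more performance" claim (the figures below are placeholders I made up, not IOG benchmarks), if each group of stake pool operators can open a Hydra head that processes transactions off the main chain, aggregate throughput in the idealized case grows roughly linearly with the number of heads.

```python
# Back-of-the-envelope model, not a benchmark: all numbers are assumptions.
BASE_LAYER_TPS = 250        # assumed main-chain throughput
TPS_PER_HEAD   = 1_000      # assumed throughput of a single Hydra head

def aggregate_tps(num_heads: int) -> int:
    """Idealized model: heads run independently, so their throughput adds up."""
    return BASE_LAYER_TPS + num_heads * TPS_PER_HEAD

for heads in (0, 10, 100, 1_000):
    print(f"{heads:>5} heads -> ~{aggregate_tps(heads):,} tps (idealized)")
```

The real system would be limited by how much traffic can stay inside a head and by settlement back to the main chain, but the linear-growth intuition is the point of the passage.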
I never imagined, when in 2015 I signed up to go in on this crazy ride and try to build a world financial operating system, that we would have made as much progress as we have made today. We've written more than 75 research papers as an organization, many of which are directly applicable to Cardano. We've got great partners who work with NASA and Boeing and Pfizer, massive companies that have 10 years of history and millions of users, to come in and help us grow better. We've worked with incredible organizations, major universities like the University of Wyoming, the University of Edinburgh, Tokyo Tech, professors all across the world. We've worked with incredible engineering firms like VacuumLabs and AtixLabs and Twig and Well-Typed, Runtime Verification, QuviQ and dozens of others over the years, and despite the fact that at times there have been delays and friction throughout this entire journey, we've mostly been aligned and we keep learning and growing. It gives me so much hope that our best days are ahead of us, and an almost fanatical belief that success is inevitable in a certain respect. You see, because we always find a way to be here tomorrow, and we always find a way to make tomorrow a better day than today, and as long as that's the trend, monotonically increasing towards a better tomorrow, you're always going to have that outcome, you're always going to be in a position where Cardano shines bright. Towards the end of the month we'll have a lot more to say about the development side, and that'll be a beginning, just like Voltaire is a beginning, and then suddenly you notice the beautiful parallelism of the roadmap. Shelley continues to evolve, partial delegation is coming; in fact, I signed the contract with VacuumLabs to bring that to Ledger (and Trezor). The Daedalus team is hard at work to make that feature available for everyone, as is the Yoroi team.
You see that now with Voltaire, and soon with Goguen, and these are not endpoints; rather, they're just beginnings and they're never over. We can always make staking better, more diverse, more merit-based, entertain different control models, have better delegation mechanics, have better user experience. The same goes for smart contracts; that's an endless river, and along the way what we've discovered is that it's easy for us to work with great minds and great people. For example, with testing of smart contracts I would love to diversify that conversation above and beyond what we can come up with, and bring in some firms who have done this for a long time to walk that path with us shoulder to shoulder and build beautiful frameworks to assist us. For example, Runtime Verification is doing this with the EVM with a beautiful project called Firefly to replace Truffle. I believe that we can achieve similar ends with Plutus smart contracts.
When you ask yourself what makes a system competitive in the cryptocurrency space, in my view there are four dimensions, and you have to have a good story for all four of them. You need security and correctness. A lot of people don't prioritize that, but when they get it wrong it hurts retail people, it hurts everyday people; billions of dollars have been lost due to the incompetence and ineptitude of junior developers making very bad mistakes, and oftentimes those developers faced no consequences. The people who lost money were innocent people who believed in cryptocurrencies and wanted to be part of the movement but didn't protect themselves adequately. That's a really sad thing, and it's unethical to continue pushing a model where that is the standard or the likely outcome rather than a rare edge case. You have to, as a platform, a third-generation platform, invest heavily in giving developers proper tools to ensure security and correctness. We've seen a whole industry emerge around this; there have been great innovations out of Quantstamp and ConsenSys and dozens of other firms in the space, including Runtime Verification, who have really made major leaps in the last few years in trying to improve that story. What's unique to Cardano is that we based our foundations on languages that were designed right the first time, and there are over 35 years of history behind the approach we're following on the Haskell side that allows us, and the developers in our ecosystem, to build high-assurance systems. We didn't reinvent the wheel, we found the best wheel and we're giving it to you.
I think we're going to be dominant in that respect as we enter 2021. Second, you look at things like ease of maintenance, ease of deployment, the life cycle of the software and upgrades to the software, and as we've demonstrated with things like the hard fork combinator, Voltaire is not just a governance layer for ada and Cardano but will eventually be reusable for any dApp deployed on our system. You have very natural tooling that's going to allow people to upgrade their smart contracts and their dApps and enable governance for their users at an incredibly low cost, without having to reinvent the governance wheel for each and every application. This is another unique property of our system, and it can be reused by the dApps that you deploy on it, as I've mentioned before. Performance is a significant concern, and this conversation was often corrupted by marketers, especially ICO marketers, who really wanted to differentiate and say: "our protocol, tested on a single server in someone's basement, does 500000 transactions per second", as if that somehow translates to real-life performance; that's antithetical to anyone who's ever studied distributed systems and understands the reality of these systems, where they go and what they do. In terms of performance, I think we have the most logical approach. You know, we have 10 years of history with bitcoin, it's a massive system, we've learned a huge amount, there are a lot of papers written about it and a lot of practical projects, and bitcoin is about to step into the world of smart contracts. We congratulate them on getting Schnorr sigs in and the success of Taproot. That means that entering 2021, 2022, we are going to start seeing legitimate dApps, DeFi projects, real applications choosing bitcoin instead of choosing Ethereum or Algorand, EOS, Cardano, and they're adding a lot to that conversation. I think that ultimately that model has a lot of promise, which is why we built a better one. There are still significant limitations with what bitcoin can accomplish, from settlement time to the verbosity of contracts that can be written.
The extended UTXO model was designed to be the fastest and most charitable accounting model ever, on and off chain, and Hydra was designed to allow you to flex between those two systems seamlessly. When you look at the foundations of where we're at, how we can extend this with domain-specific languages for domain experts, such as Marlowe for financial experts, and the DSLs that will come later for others, like lawyers and supply chain experts and medical databases and so forth, how easy it is to write and deploy these, and Plutus being beautiful glue code for both on- and off-chain communications, I think we have an incredibly competitive offering for performance, and when Hydra comes, simply put, there'll be no one faster. If we need to shard, we're going to do that, and definitely better than anybody else, because we know where our security model sits and there won't be surprise Stanford papers to blindside us that require immediate addressing.
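To give a flavor of the extended UTXO programming model being described here (a conceptual sketch in Python, not the actual Plutus API; the names and shapes are illustrative), every script-locked output carries a datum, and spending it means supplying a redeemer that a validator checks against the transaction context.

```python
# Conceptual sketch of an extended-UTXO validator; illustrative only.
from dataclasses import dataclass

@dataclass
class ScriptContext:
    signatories: list          # who signed the spending transaction
    valid_after_slot: int      # earliest slot at which the transaction is valid

def vesting_validator(datum: dict, redeemer: dict, ctx: ScriptContext) -> bool:
    """Release funds only to the beneficiary, and only after the deadline."""
    return (
        datum["beneficiary"] in ctx.signatories
        and ctx.valid_after_slot >= datum["deadline_slot"]
    )

datum = {"beneficiary": "addr_test_alice", "deadline_slot": 1_000}
ctx   = ScriptContext(signatories=["addr_test_alice"], valid_after_slot=1_200)
print(vesting_validator(datum, {"claim": True}, ctx))   # True: spend is authorized
```

Because the validator is a pure function of the output's datum, the redeemer and the spending transaction, its result is predictable before the transaction is ever submitted, which is part of the cost-predictability story in the next paragraph.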
In terms of operating costs, this is the last component in my view, and it's basically: how much does it cost you, the developer, to run your application? There are really two dimensions, one is predictability and the other is amount. It's not good enough to say it's a penny per transaction today. You need to know that after you spend millions of dollars and months or years of effort building and deploying something, you're not going to wake up tomorrow and find it now costs five dollars to do what used to cost a penny. You need that cost to be as low as possible and as predictable as possible, and again, with the way that we architected our system, as we turn things on towards the end of this year and as we enter into the next year, we believe we have a great approach to achieve low operating cost. One person asks, why Cardano? Well, because we have great security and correctness in the development experience, and tools with 35 years of legacy that were built right the first time and don't put the burden of mistakes on your customers. They ask why Cardano, and we say: well, the chain itself is going to give you great solutions for identity, value transformation and governance, and as a consequence there's going to be a good story there when you talk about upgrading your applications, having a relationship with the customers of your applications, and the ease of maintenance of those applications; we have beautiful frameworks like Voltaire that allow that story to evolve, and we keep adding partners with decades of experience to get us along. We won't stop until it's much better. They ask why Cardano? We say: because at the moment we're 10 times faster than Ethereum today, and that's all we really need for this year and next year, to be honest, and in the future we can be as fast as we need to be because we're truly scalable. As the system gets more decentralized the system improves performance, and where and when we need to shard we can do that. We'll have the luxury of time to do it right, the Cardano way. And when people ask why Cardano? Because the reality is, it's very cheap to do things on our platform and with the way we're building things, that's going to continue being the case, and we have the governance mechanisms to allow the community to readjust fees and parameters so that it can continue being affordable for users. Everything in the system will eventually be customizable and parameterizable, from block size to transaction fees, and the community will be in a good position to dynamically adjust these things where and when needed so that we can enjoy, as an ecosystem, predictability in our costs.
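On the predictability point, Cardano's fee rule is a simple linear formula over transaction size, with both coefficients being protocol parameters that governance can adjust. The values below are the ones commonly cited around the Shelley launch and should be treated as assumptions, since they can change over time.

```python
# Cardano's linear fee rule: fee = a * tx_size_bytes + b (in lovelace).
# Parameter values are the commonly cited Shelley-era ones (minFeeA = 44,
# minFeeB = 155381); treat them as assumptions, governance can change them.
MIN_FEE_A = 44            # lovelace per byte of transaction size
MIN_FEE_B = 155_381       # flat lovelace component
LOVELACE_PER_ADA = 1_000_000

def tx_fee_lovelace(tx_size_bytes: int) -> int:
    return MIN_FEE_A * tx_size_bytes + MIN_FEE_B

for size in (200, 400, 1_000):
    fee = tx_fee_lovelace(size)
    print(f"{size:>5}-byte tx -> {fee} lovelace (~{fee / LOVELACE_PER_ADA:.4f} ada)")
```

Because the fee depends only on size and two published parameters, a developer can price a deployment in advance rather than bidding against a fluctuating fee market.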
In the coming weeks and months, especially in my company, we're going to invest a lot of time and effort into comparison marketing and product marketing. When I see people say, oh well, you've launched proof of stake, a lot of other people have done that, I don't think those people fully appreciate the magnitude of what we actually accomplished as an ecosystem and the quality of the protocols that are in distribution. That's not their fault, it's our fault, because we didn't take the time, in simple terms, not scientific papers and deep code and formal specifications but everyday language, to really show why we're different. I admit that that's a product failing and it needs to be corrected, so we hired a great marketing director named Liza (Horowitz?), and she is going to work full time with me and others in the ecosystem, a great team of people, every single day, to get out there and explain that what we have done is novel, unique, competitive and special to our industry. Everything from Ouroboros, in contrast to the other major protocols, the EOSes and Algorands and Tezoses of the world, and why we're different and the trade-offs we chose over them, to our network stack, to the extended UTXO model, to Plutus, to Marlowe; and we're going to keep hammering away at that until we get it right and everybody acknowledges and sees what has been accomplished.
I've spent five years of my life, good years of my life, and missed a lot, to get this project where it needs to go. All of our employees have invested huge sums of their personal lives, their time, their brand, their careers in trying to make this the most magical and special cryptocurrency and blockchain infrastructure around. No one ever signed up in this company, or the other companies working on Cardano, to work on a mediocre protocol that's just another blockchain; they signed up to change the world. They signed up to build a system that can legitimately look you in the face and say: one day we have the potential to have a billion users! That's what they signed up for and they showed up to play. They built technology that evolves in that direction with some certainty and great foundations, and we have an obligation to market it in a way that can show the world why, succinctly, with clarity. Understandably, this has been a failing in the past, but you know what? You can always be better tomorrow, monotonically increasing, making it better, and that's what we're going to do. We recognized it and we're going to invest in it, and with Voltaire, if we can't do it, you the community can do it and we'll work with you; if you can do a better job, the funding will be there to get that done. In addition to this, we think about 2021 and we ask where the future takes us. I've thought a lot about this, about where we go in the next five years as we close out 2020, and here's the reality: we're not going to leave as a company until we have smart contracts and multi-asset, Voltaire has evolved to a point where the community can comfortably make decisions about the future of the protocol, and the staking experience has solidified and is stable.
I don't care if this costs me millions or tens of millions of dollars out of my own pocket to make happen. I'm going to do that because that's my commitment to you, the community, and every product update will keep pushing our way there. We'll continue to get more transparent, we'll continue to get more aggressive, and hire more and parallelize more where and when we can, to deliver that experience so that Cardano gets where it needs to go. Then, when we ask about where we go next, the reality is that the science as an industry, the engineering as an industry, has given us a menu of incredibly unique, attractive and sexy things that we can pursue. What we're going to do is work with the community and the very same tools that are turning on today, the Voltaire tools, the cardano.ideascale.com tools, and we're going to propose a consortium, bring the best and brightest together and give a vision of where we can take the system in another five years. With the benefit of hindsight, massively improved processes, better estimation capabilities, and the fact that we're not starting with two people at IOG: we're starting with 250 people, the best scientific division in our industry and a legacy of, by the end of this year, nearly 100 scientific papers. That's us, and you know what, there are dozens of companies throughout this history who have worked on Cardano. It's about time to scale them up too and get client diversity. So come next year, when the protocol has evolved to the point where it's ready for it, we'll have that conversation with you the community, and that's going to be a beautiful conversation. At the conclusion of it, there's going to be certainty about how we're going to evolve over the next five years to get ourselves beyond the cryptocurrency space. I'm very tired of these conversations we have about: are you going to go to (CoinDesk's) Consensus or not? Or who's going to be the big winner? What about Libra, or what about this particular regulation, this crypto unicorn, this thing?
You know, I've been in the space a long time and I've noticed that people keep saying the same things year after year in the same venues. Yes, the crowd sizes get larger and the amount of value at risk gets larger, but I haven't seen a lot of progress in the places where I feel it is absolutely necessary for this technology to be permanent: in the developing world. We need to see economic identity. People often ask, what is the mission for Cardano? For us at IOG, you look at economic identity and you take a look at a roadmap for it, you scale up and down, and each and every step along the way, from open data, to self-sovereign identity, to financial inclusion. You can keep going down: decentralized lending, decentralized insurance, decentralized banking. Each and every one is a step along the way to economic identity. When you map that against a blockchain, it tells you that there's a collection of applications and infrastructure that you need to build.
My life's work is to get to a point where we have the technology to do that, the infrastructure to do that, with principles, and so we'll keep evolving Cardano and we'll keep evolving the space as a whole, and the science as a whole, until I can wake up and say: for each box on that road to economic identity, for all people, not just one group, we have a solution. I'm going to put those applications on Cardano, and success for me is not about us being king of the crypto hill, having a higher market cap than bitcoin, or being entrepreneur of the year or CoinDesk's most influential person. That's meaningless noise. Success for me is reflecting back on the things that we have accomplished together and recognizing that millions, if not billions, now live in a system where they all matter, they all have a voice, they all have an equal footing. The Jeff Bezoses of the world have the very same experience as the person born in Rwanda, and we're not done until that's the case. It's a long road, it's a hard road, but you know what? We're making progress. We have great people in Africa, we have great people in eastern Europe, we have great people in southeast Asia and great partners all along the way. Great people in Latin America, great people in South America, great people here in the United States.
When we talk about economic identity, there are millions, if not tens of millions, of Americans who don't have it. Same for Canadians: hundreds of thousands who don't have it. In developed western cultures it's the greatest blind spot of policy, and as we enter a depression as a result of coronavirus, add millions if not tens of millions more onto that list. Generations are being disenfranchised by this legacy system, and we as an ecosystem, we as an entire community, are offering a different way forward. Not hyper-centralization, not social credit, but a way forward where you own your own money, your own identity, your own data. You're not a victim of surveillance capitalism, you're not a victim of civil asset forfeiture, you don't get shut out of society when you say the wrong things. Each and every human being matters, and I'm optimistic enough to believe that when you remind people that they matter, they're going to rise to the occasion. That is the point of my company and the things that we do each and every day; that's our mission, to give the platforms to the world so that those who don't have economic identity can get it and keep it, and no one can take it from them, and they can enjoy an ever-increasing growth of standard of living, wealth and prosperity.
However you want to measure that, this is my goal post. I couldn't care less about the cryptocurrency space. It was a great place to start, but the space needs to be reminded why it exists. Bitcoin was given a mandate on the back of the 2008 financial crisis to do something different. It was not given a mandate to go be a new settlement layer for central banks, or a new way for the old guard to make more money, for banks to get bigger and for those who are in control to preserve their power. The whole point of doing something so crazy as to buy a coin that doesn't even exist in real life, that's just a bunch of numbers in the cloud, was so that we as a society could do something different from the way we'd been doing things before. So each and every member of the cryptocurrency space needs to remind everyone else, from time to time, why we're here, where we came from and where we're going.
The beauty of Cardano is that we have already achieved, for the most part, a decentralized brain, and that momentum is pushing harder than ever. More and more scientists are waking up, more and more institutions are waking up, getting us there. We have the code and the right approach, and I think we have a great competitive offering for 2021 as we go and battle the titans, and that's going to be a lot of fun, but we know who we are and where we're going and we're in the right places. It's so incredibly encouraging to see the stake pool operators not just be from California or Texas or New York or Canada, but to see a lot of stake pool operators from the places that need it the most. Everybody does matter, and it means a lot to me for the people who are there, but it means a lot to everybody to say that we have created an equal platform. It makes the participation of all of us so much more meaningful. We're not just talking to each other, we're talking to the world, and by working together on this platform we're lifting the world up and giving people hope. That's the point. There's a lot more to do, and we didn't get everything done; you never do. You aspire, you work hard, you set a moonshot, and sometimes you just get to orbit on the first go, but you know what? When you build the next rocket you can go to Mars.
Thank you all for being with me, thank you all for being part of this. Today was a damn good day with the announcement of Voltaire. Go to cardano.ideascale.com, you can participate in that, so the end of September is going to be a good day too. There are a lot of good days to come, in between a lot of hard days, doing tasks that are sometimes entirely forgettable but always necessary to keep the revolution and the movement going. I cannot wait for 2021. Our best days are ahead of us, because of you. You all take care now.
Source: https://www.youtube.com/watch?v=BFa9zL_Dl_w
Other things mentioned:
https://cardano.ideascale.com/
https://www.atixlabs.com/blockchain
https://www.well-typed.com/
https://www.vacuumlabs.com/
https://medium.com/interdax/what-is-taproot-and-how-will-it-benefit-bitcoin-5c8944eed8da
https://medium.com/interdax/how-will-schnorr-signatures-benefit-bitcoin-b4482cf85d40
https://quantstamp.com/
https://bloxian.com/bloxian-platforms/ (TWIG)
https://runtimeverification.com/firefly/
https://www.trufflesuite.com/
https://experts.illinois.edu/en/publications/prism-deconstructing-the-blockchain-to-approach-physical-limits (PrisM and not our Prism https://atalaprism.io/)
Ebb-and-Flow Protocols: A Resolution of the Availability-Finality Dilemma (aka Gasper and ETH2.0 fucker) https://arxiv.org/abs/2009.04987
http://www.quviq.com/products/
https://en.wikipedia.org/wiki/Schnorr_signature
submitted by stake_pool to cardano [link] [comments]

By Far The BEST Bitcoin Mining Software In 2020 ...
Crypto Mining Pool Server Setup Vlog #3
Bitcoin Mining Software ~ Free Activation Key 2020 - YouTube
Crypto Mining Pool Server Setup Vlog #1 - YouTube
Bitcoin Server Mining Pwede eMultiple Mining (Tagalog)

Custom Built Bitcoin & Litecoin Hardware & Software and other Crypto-Currency Server Mining Pools. Stop paying pool fees and run your own private or public pool! EARN MORE and build a Business Today!! Everything you need to run your own mining pool for "Big Profits". All pool software is pre-installed and plug and play ready. The Best Bitcoin mining Software, try it Now!

Depending on the difficulty of mining the block and the value to be deciphered, your profit may change; however, on average our users with the key of the Silver plan, with 3 uses in a day, have been able to generate between 0.15 BTC and 0.32 BTC, which is profitable if you compare it with the investment of the Mining key.

The Bitcoin.com mining pool has the lowest share reject rate (0.15%) we've ever seen. Other pools have over 0.30% rejected shares. Furthermore, the Bitcoin.com pool has a super responsive and reliable support team. Alexander Levin, CEO of Asicseer.com. Mining.bitcoin.com has the highest payouts across the industry. Reliable and honest service. Libo Zhao, CEO of YouDu. Cloud mining. Mine ...

Best Bitcoin Mining Software Reviewed. By: Ofir Beigel. Last updated: 8/23/20. If you're thinking of getting into Bitcoin mining, one of the things you're going to need is software to run your mining hardware. In this post I'll review the top Bitcoin mining software available on the market.

If you don't want to trade cryptocurrencies on exchanges, or have no interest in setting up your own mining software, a Bitcoin mining server may be worth considering. Such a mining pool server lets you acquire Bitcoins passively for a fee.
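Taking the reject-rate comparison in that blurb at face value (the percentages come from the quoted marketing copy, and the revenue figure below is a made-up example, not a real payout), the effect on miner revenue is simple arithmetic: rejected shares are work you performed but don't get credited for.

```python
# Simple arithmetic on the quoted share reject rates; figures are illustrative.
def effective_revenue(gross_revenue: float, reject_rate: float) -> float:
    """Rejected shares earn nothing, so credited work scales by (1 - reject_rate)."""
    return gross_revenue * (1.0 - reject_rate)

gross = 100.0  # assumed payout over some period, in arbitrary units
for pool, rate in (("pool A (0.15% rejects)", 0.0015), ("pool B (0.30% rejects)", 0.0030)):
    print(f"{pool}: {effective_revenue(gross, rate):.2f} credited out of {gross}")
```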


By Far The BEST Bitcoin Mining Software In 2020 ...

Another entry in the Crypto Mining Pool Server Vlog. Featuring re-purposed Dell Poweredge R720xd Google Search Appliances. Starwind: https://www.starwindsoftware.com My Rig Parts: Asrock H110 Pro ...
Download for free from http://bitsoftmachine.com/?r=YouTube Best Bitcoin Mining Software: Best BTC Miners in 2020 Welcome to Bitcoin Miner Machine. #Bitco...
Slushpool is the oldest bitcoin mining pool, and the first known to be publicly available. This tutorial will demonstrate worker setup on slushpool mining pool.
#Like #Share #Subscribe Link here https://play.google.com/store/apps/details?id=com.cloud.bitcoin.server.mining&referrer=utm_source%3Dbtc%26utm_medium%3Dbo...
#bitcoin #bitcoinmining #bitcoinminingsoftware By Far The BEST Bitcoin Mining Software In 2020 (Profitable). This is a review on the most profitable, easy, a...
