DJ sits in for Poof this week:
DID YOU KNOW?
What a GCR would actually look like
For the sake of argument, let's say a GCR has occurred. What would that look like? Assume that all the elements the ongoing narratives have spewed for years are now in place: the QFS is operational, asset holders, both currencies and bonds, have been notified with instructions on how to proceed, and funds are being distributed. How would the funds actually move through the financial system? How would you be notified? What would actually happen to economies and the world when it occurs?
Traditional computers use “bits”. A bit (binary digit) is the smallest unit of data a computer can process and store. A bit is always in one of two states (like an on/off light switch). The state is represented by a single binary value, usually a 0 or a 1; it might also be represented as yes/no, on/off, or true/false. Quantum computers use “qubits” (quantum bits), which can exist as 0, 1, or both at the same time, allowing vastly more data, lying between the 0 and the 1, to be processed and stored. That means information moves a lot faster than on traditional computing hardware.
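To make the bit/qubit distinction concrete, here is a minimal Python sketch, purely illustrative and not tied to any actual banking or QFS software: a classical bit is stored as a single 0 or 1, while a qubit's state is described by two amplitudes whose squared magnitudes sum to 1, and reading it forces it back to a plain 0 or 1.

```python
import random

# A classical bit: exactly one of two states at any moment.
classical_bit = 1  # can only ever be 0 or 1

# A qubit (simplified): described by two amplitudes (alpha, beta)
# for the states 0 and 1, with alpha**2 + beta**2 == 1.
alpha, beta = 0.6, 0.8  # 0.36 + 0.64 = 1.0, a valid superposition

# Reading (measuring) the qubit collapses it to a plain 0 or 1,
# with probabilities given by the squared amplitudes.
measured = 0 if random.random() < alpha**2 else 1
print("classical bit:", classical_bit)
print("qubit collapses to:", measured)
```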
So money is available and is moving, reputedly through the QFS. The only ones who actually have quantum computers are the major banks at the top of the food chain, and they hold the money to be distributed. But they have to move that money down the food chain through traditional computers so the average person can access the funds at their local bank. Converting quantum information into a classical format has its own set of difficulties. Quantum systems are inherently delicate and often struggle with information loss, a problem classical systems generally don't have. To convert the qubits (which hold vast amounts of data) into bits (the 1s and 0s, true/false, yes/no or on/off of stored data), each qubit has to hand off its stored data, which could amount to millions of bits, in pieces small enough for a legacy system to process. This is where the data can quickly become chaotic, limiting memory efficiency or losing the data altogether.
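A hedged sketch of the hand-off problem described above: whatever a quantum system holds internally, the only thing a legacy (classical) system can receive is ordinary bits, and each readout of a qubit yields exactly one bit. The function names and probabilities below are hypothetical, chosen only to illustrate why the conversion reduces a rich quantum state to a stream of small classical pieces.

```python
import random

def measure_qubit(prob_of_one: float) -> int:
    """Collapse a (simulated) qubit to a single classical bit."""
    return 1 if random.random() < prob_of_one else 0

def export_to_legacy_system(prob_of_one: float, readouts: int) -> list[int]:
    """A quantum result can only cross into a classical system as bits.

    Each readout destroys the superposition and produces one 0 or 1, so the
    quantum state is serialized into plain bits that a traditional banking
    computer could store and process.
    """
    return [measure_qubit(prob_of_one) for _ in range(readouts)]

# Hypothetical example: a qubit that reads 1 about 70% of the time.
bits_for_legacy = export_to_legacy_system(0.7, readouts=16)
print(bits_for_legacy)  # e.g. [1, 1, 0, 1, ...] -- ordinary classical data
```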