Computation Is Existence: A Brief Overview of the Multi-faceted Implications of the Quantum-Mechanical Description of Black Holes as Hypercomputational Entities

This article deals with the newly emerging paradigm of black hole computers, in which adopting a quantum-mechanical perspective on information enables us to assess the computational power of black holes. Viewing spacetime itself as a computational entity and black holes as the supreme form of serial computer can help us gain insight into ideas from gravitational thermodynamics and the emergent nature of space-time and gravity. The idea of black holes as computational entities also relates to quantum gravity, which views space-time as foamy and fuzzy due to quantum fluctuations and as divided into discrete, Planck-scale blocks.


Introduction
First discovered as unexpected solutions to Einstein's equations of general relativity, black holes are spherical regions of such strong gravity that even photons cannot escape them. Generally, stellar-mass black holes are formed from the inward gravitational collapse of massive supergiant stars. Following Einstein's theory of general relativity, which equates the force of gravity with the curvature of space-time, black holes are in fact regions where gravity becomes so overwhelmingly strong that the fabric of space-time gives way and collapses toward a theoretical point called the "singularity", the supposed centre of the black hole. General relativity itself dictates that even though a black hole might outwardly stay static and of constant size, its inner volume should keep growing as space-time stretches toward the singularity: the inward gravitational collapse never stops. If we imagine a black hole's interior as a funnel extending downward from the three spatial dimensions, we have to conclude that it never truly reaches the singularity; rather, the funnel only gets deeper and deeper as the mass of the black hole increases. The spherical boundary that marks the point of no return is termed the event horizon.
Many physicists also believe that the warped and bent space-time of a black hole's interior should be thought of as a collective state of a large number of "gravitons", a description that will only become possible once a full theory of quantum gravity is developed. It was Jacob Bekenstein who first theorized that the area of a black hole's event horizon corresponds to its entropy, the number of different microscopic arrangements of all the particles that ever fell into the hole; this also determines the black hole's information storage capacity. Later, Stephen Hawking found that because black holes have temperatures, they should radiate and eventually evaporate, so all the information that ever fell into a black hole would be irretrievably annihilated. This contradicts the conservation of information, one of the foundational assumptions of quantum mechanics, and gave rise to the famous black hole information paradox, which we discuss later in this study. Now, when we apply quantum mechanics to the description of a black hole's ever-growing interior, a different picture emerges. Researchers such as Susskind and Preskill have proposed that in order to unlock the secrets of the ever-growing volumes of black hole interiors, we should see them as super-efficient information-processing systems, or quantum computers. In plain words, it is the "complexity" of the black hole that becomes the focus of study when we explore black holes as ultraefficient computers: the complexity of a black hole corresponds to a rough estimate of the total number of computational operations needed to recover the black hole's initial quantum state. The growth of complexity and the growth of a black hole's volume seem to correspond to and complement each other, and black holes also store the maximum information allowed by their surface area.
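Bekenstein's area-entropy relation can be made concrete with a back-of-the-envelope calculation. The sketch below is my own illustration, not taken from the works cited here: it evaluates the Bekenstein-Hawking formula S/k_B = 4πGM²/(ħc) for a Schwarzschild black hole of one solar mass and converts the result to bits (constants are rounded).

```python
# Bekenstein-Hawking entropy of a Schwarzschild black hole:
#   S / k_B = 4 * pi * G * M^2 / (hbar * c)
import math

G     = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
HBAR  = 1.0546e-34   # reduced Planck constant, J s
C     = 2.998e8      # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg

def bh_entropy_bits(mass_kg):
    """Information capacity of the horizon, in bits (nats / ln 2)."""
    s_over_kb = 4 * math.pi * G * mass_kg**2 / (HBAR * C)
    return s_over_kb / math.log(2)

print(f"1 solar mass: ~{bh_entropy_bits(M_SUN):.1e} bits")  # ~1.5e77
```

Note that the capacity grows as the mass squared, i.e. with the horizon area rather than the enclosed volume, which is the holographic point made above.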
Considered from a certain complexity-theoretic perspective, black holes appear to be the fastest and most efficient computers in the entire universe, whose volume and complexity both grow simultaneously as they continue to encode more and more information. As the most efficient computers, at constant energy black holes implement quantum gates faster than any other object in our physical universe. This also implies that no other entity can simulate the output of a black hole faster than the black hole itself produces it, although this last property, that a system's own evolution cannot be outpaced by a simulation of it, is not unique to black holes.

Black Hole as a Computer and the Emergence of Space-Time
Seth Lloyd asks, "What is the difference between a computer and a black hole?" Though this question may strike us as whimsical and devoid of deep physical connotations, it is full of rich scientific implications. As we have already seen, by virtue of being algorithmic, many physical processes register and process information, and so they can all be termed computers. Physicist John Wheeler stated, "It from bit." Black holes might appear to contradict the very idea of computation, since their input presents a conundrum: one can input anything one desires, but according to Einstein's general theory of relativity, no output from them is possible. All matter falling into a black hole is irretrievably lost, and all details of its composition are destroyed. In the 1970s Stephen Hawking showed that if we include the effects of quantum mechanics in our calculations, black holes do indeed emit radiation like a perfect black body. However, the radiation will be random, and any information encoded in it will be too scrambled to be extracted in any meaningful form. A resolution of this paradox involves answering one of the most fundamental questions about space-time: how space-time emerges at the most fundamental level.
From the perspective of general relativity, black holes arise when the density of matter reaches a critical point beyond which no known mechanism can support it, and the material collapses gravitationally all the way toward its central point. When this collapse actually takes place, as in supernovae or in the gas clouds at the centers of almost all giant galaxies, gravity becomes too strong to allow even light to escape. The inside of the black hole thus becomes shrouded and disconnected from outside space-time, and the boundary, called the event horizon, begins to act as a one-way membrane: nothing can escape from the interior, even though anything that gets too close to the black hole will be sucked into the vortex of infalling space-time. Now, if we incorporate quantum-mechanical effects in the way Hawking did, a black hole does in fact radiate, albeit very slowly, and this results in its slow but inevitable evaporation. The radiation, however, carries no information about the objects that created the hole in the first place, so information seems to be effectively lost, which is absolutely prohibited in quantum mechanics. This violates the much-cherished notion of information preservation in modern physics, which states that if we have perfect knowledge of the state of a system, then by solving the equations of motion we can predict its future evolution and reconstruct its past history. Maldacena (1997), through his AdS/CFT formulation, endeavored to prove that no information is lost in the black hole evaporation process. However, Almheiri (2012) and collaborators showed that if the information of the initial state is conserved through the Hawking evaporation process, the 'smoothness' of the event horizon gets disturbed.
So the black hole transforms from a one-way membrane, whose horizon anything can pass effortlessly and without barrier, into a burning, seething, impenetrable 'firewall'. This firewall idea seems to prove Einstein's general relativity wrong, at least near the event horizon. Moreover, for gigantic black holes of millions to billions of solar masses, gravity is weak at the horizon, because the horizon lies far from the central point, and yet the firewall seems to appear abruptly from nowhere in an otherwise undisturbed, smooth horizon. Physicist Yasunori Nomura states in this regard: "…there are multiple layers of descriptions of a black hole, and the preservation of information and the smoothness of the horizon refer to theories at different layers" ("Have We Solved the Black Hole Information Paradox"). A black hole, on the one hand, can be described by a faraway observer who sees a lump of matter collapse to form a black hole and later evaporate through the emission of Hawking radiation into space; from Maldacena's perspective, there is no information loss in this process. In this description, the object never enters the horizon; rather, due to an infinite time dilation, it seems to a distant observer to linger at the horizon for an indefinite period of time. Thus the object is gradually assimilated into the black hole, and its information content is radiated back into the universe in the form of subtle correlations between the particles of Hawking radiation.
On the other hand, if we adopt the perspective of an observer falling into the hole, we find that they would hit the singularity in a finite time, and thus should perceive a different reality and experience a different set of events while falling toward the center of the black hole. This can be termed the "coarse-grained" picture of the black hole information paradox. Professor Nomura states: "And in this picture, information need not be preserved because we already threw away some information even to arrive at this perspective. This is the way the existence of interior spacetime can be compatible with the preservation of information: they are the properties of the descriptions of nature at different levels!" What this tells us is that the picture of spacetime offered by general relativity may not be the ultimate one; in the hierarchical description of nature, it is merely an emergent, higher-level picture. There should be many quantum degrees of freedom at the most microscopic level that can give rise to emergent spacetime. So, in the hierarchical description of nature, black holes can point toward the fundamental building blocks of nature. Researchers such as Geoff Penington, Stephen H. Shenker, Douglas Stanford and Zhenbin Yang, in their paper "Replica wormholes and the black hole interior", have attempted to "obtain the Page curve of an evaporating black hole from holographic computations of entanglement entropy". (Y. Jack Ng, 2019), in "Entropy and Gravitation", also hints at the possibility that gravity may, after all, be an emergent phenomenon that arises from the more fundamental quantum entanglements in space-time: "As the gravitational thermodynamics and entropic gravity ideas have hinted, gravitation may ultimately be derived from thermodynamic/entropic arguments.
And if we also take seriously the recent proposal that spacetime geometry/gravitation may simply be an emergent phenomenon from quantum entanglements, as implied by the conjecture ER = EPR, we can certainly entertain the idea that even quantum mechanics could be related to thermodynamics in a deep and unfathomable way. If so, then it follows that thermodynamics, Einstein's "meta-theory", may hold the key to formulating as well as understanding the ultimate physical laws; and reigning supreme will be its protagonist: entropy".
Other scientists, including Leonard Susskind, John Preskill and Gerard 't Hooft, have argued that the emanating Hawking radiation might not, after all, be totally random, but may just be a processed form of the infalling matter. Hawking also eventually seemed to agree with their viewpoint.
When it comes to the question of finding a satisfactory solution to the black hole information paradox, a vast range of exotic phenomena such as wormholes, quantum entanglement, quantum computers, the holographic principle, and emergent space-time come into the picture. Even though physicists are still not sure whether information is really conserved during the evaporation of black holes, it is becoming increasingly clear that space-time might be an emergent phenomenon arising from something much deeper, rather than the fundamental description of reality. Einstein's formulation of the geometry of space-time, as presented in general relativity, also contains the possibility of a breakdown of spacetime during which information could escape black holes.
Don Page examined Hawking's conclusion that black hole formation and evaporation involve an irreversible dissolution of information content, which appears to violate the laws of quantum mechanics, since this irreversibility breaks the fundamental time symmetry of quantum evolution. Initially Page's advisor Hawking defended this possibility, but during the 1980s Page came to propose that black holes have to preserve information at any cost (Don Page, 1980). Page then went on to include the effect of quantum entanglement in the Hawking evaporation process of black holes.
He theorized that the radiation a black hole emits maintains a quantum-mechanical link to the interior from which it was released. So, when one considers the black hole together with the pattern of emitted radiation, a different scenario emerges. The information seems to be encrypted in the radiation pattern itself: separately, the radiation and the black hole seem meaningless, but considered in unison they can provide a meaningful answer to the question of the information paradox.
Page calculated that the total amount of entanglement at the beginning of the evaporation process is zero, and that toward the end it should be zero again if information is indeed preserved. What intrigued Page was how the entanglement entropy would change in the middle. Initially, as the black hole gradually emits radiation, the entanglement entropy increases steadily; but if it has to come back down to zero at the end, the trend must reverse somewhere in the middle. Page showed that roughly halfway through the process there occurs a turnover at which the entanglement entropy, instead of increasing steadily, starts decreasing; this turnover is the Page time, at which the curve peaks before falling again, tracing an inverted-V trajectory. Even though the black hole at that moment would still be enormously larger than any subatomic particle, quantum-gravity effects should become important on macroscopic scales during such a phase. Page's calculation exposed a problem with the semiclassical approximation of black hole evaporation, since the known laws of physics seem to give way to something deeper even in the low-energy regime. So information can get out of a black hole if the entanglement entropy follows the Page curve, and the question becomes whether the evolution of the entanglement entropy follows this inverted-V curve or not. If it does, black holes obey quantum mechanics and preserve information; if it does not, the semiclassical picture of general relativity holds up.
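The inverted-V behavior can be captured with a deliberately crude toy model, my own illustration rather than Page's actual calculation: take the entanglement entropy at each moment to be the smaller of the radiation's accumulated entropy and the hole's remaining entropy, with a linear caricature of the (really nonlinear) evaporation. The function name is hypothetical.

```python
# Toy Page curve: entanglement entropy ~ min(S_radiation, S_black_hole).
def page_curve(s_initial, steps):
    """Return the entanglement entropy at each step of a linear toy evaporation."""
    curve = []
    for t in range(steps + 1):
        s_rad = s_initial * t / steps   # entropy already radiated away
        s_bh  = s_initial - s_rad       # entropy remaining in the hole
        curve.append(min(s_rad, s_bh))
    return curve

curve = page_curve(s_initial=100.0, steps=10)
print(curve)  # rises to a peak at the halfway (Page) point, then falls to zero
```

The curve starts at zero, peaks at the Page time halfway through, and returns to zero, which is exactly the inverted-V shape whose presence or absence decides the paradox.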
Almheiri (2018) and his colleagues applied the AdS/CFT concept, first developed by Maldacena (1997), to the problem of the information paradox. In the AdS/CFT formulation, the negatively curved, saddle-shaped, higher-dimensional bulk of the universe carries gravity, while its lower-dimensional boundary is governed by the laws of quantum field theory without gravity. As Andrews describes: "The AdS/CFT correspondence allows for a dual description of an anti-de Sitter space and a conformal field theory of one less dimension, one of the most well-known of which is the correspondence between the AdS5 × S5 space and the D = 4, N = 4 supersymmetric SU(N) Yang-Mills theory". In such a universe, if one has a black hole in the bulk, its simulacrum pops out on the boundary, and in this universe information does not get lost. Here, the radiation that a black hole emits eventually gets reflected back into the bulk after a long period of time. Physicists (Netta Engelhardt, 2019; Almheiri, 2018) endeavored to show what would happen if the radiation were not allowed to fall back into the bulk. Engelhardt (2019) attempted to formulate a more granular understanding of AdS/CFT and to see which parts of the bulk correspond to which parts of the boundary. In the paper titled "Coarse Graining Holographic Black Holes", they expanded their work on holographic coarse-grained entropy. From these works arose the idea of quantum extremal surfaces. Such a surface divides the bulk space into two parts and realizes a correspondence between the properties of one part and those of the other. According to this scenario, the entanglement entropy between the two corresponding parts of the boundary matches the surface area of the black hole's event horizon. Here, the quantum extremal surface connects the geometric concept of area with the quantum concept of entanglement, which also seems to pave the way for a fuller realization of a quantum theory of gravity.
These theories thus also enable one to measure the entropy of black hole interiors. (G.R. Andrews, 2017), in his paper "Black hole as a model of computation", attempts to convert the Bekenstein-Hawking entropy "from the traditional form in terms of the horizon area to that of the Shannon entropy, establishing an analogy between the physical and computational perspectives of the system". In this paper, Andrews sought to model the black hole itself as a model of computation, based on the ideas of holographic duality, and to assess the feasibility of such a system in the real world.
(Almheiri, 2019) and his colleagues showed that the black hole and its emitted radiation transfer information from one to the other while following the same Page curve. In their scenario, the researchers showed that after sufficient time elapses, quanta that the black hole has emitted effectively become part of the radiation rather than of the hole, and so cease to contribute to the hole's entropy. Researchers have also recently found that the quantum entanglement between the emitted radiation and the black hole can be interpreted as a wormhole, acting as a tunnel through which information can leak out of the black hole's interior. (Maldacena, 2013) proposed that quantum entanglement can be thought of as a wormhole. They described their aim thus: "General relativity contains solutions in which two distant black holes are connected through the interior via a wormhole, or Einstein-Rosen bridge. These solutions can be interpreted as maximally entangled states of two black holes that form a complex EPR pair. We suggest that similar bridges might be present for more general entangled states. In the case of entangled black holes one can formulate versions of the AMPS(S) paradoxes and resolve them. This suggests possible resolutions of the firewall paradoxes for more general situations" ("Cool Horizons for Entangled Black Holes").

Computational Complexity of Black Holes
As some of the simplest yet eeriest and most exotic physical systems in the entire observable universe, black holes appear to register and process information. The principle of conservation of information was developed in the 19th century by the founders of statistical mechanics in order to explain the laws of thermodynamics. It is the idea of entropy that connects the laws of thermodynamics with information theory, as entropy can be said to be proportional to the number of bits registered by the positions and velocities of the molecules in an object. Just as Deutsch's principle seemed to bridge the gap between the artificial and the natural sciences, here too entropy bridges the gap between thermodynamics and information theory. In the case of the universe, it is qubits, or more specifically the entanglement between them, that seem to be the foundational blocks, the quantum degrees of freedom that weave the entire fabric of space-time. These quantum bits, or "qubits," are much richer objects than ordinary bits.
"Black holes are quantum computers. We have an explicit information-processing sequence," says Professor Gia Dvali. "The universe is not just a giant computer; it is a giant quantum computer," states Seth Lloyd. Quantum computers use qubits instead of ordinary bits, and this matters because a particle such as an electron can be spin up (1) and spin down (0) simultaneously, both 0 and 1 at once. While a classical computer can only process one register of data stored in the system at a time, a quantum computer can juggle all possible combinations of data. Even a handful of particles in a superposition of 1 and 0 can represent an immense amount of information: just 100 particles in superposition suffice to represent all the numbers from 1 to 2^100 (Aaronson, 2008). In contrast to a classical computer that can read one combination of three bits at a time, a quantum computer goes through all possible combinations at the same time. So, by virtue of this parallel information processing, a quantum computer with N qubits can act on 2^N amplitudes in one go, giving us a radically novel and fast way of processing information and computing. Highly complex tasks, like factoring large numbers or evaluating extremely complex algorithms, could thus be performed by a quantum computer in a tiny fraction of the classical time. Whether quantum computers are merely fast computers or something more depends on how we look at them. The power of a computer is judged by how much time it consumes in solving a task of increasing complexity, and the time consumed is measured by the number of steps the algorithm demands before arriving at a solution. Factoring, for instance, is far more complex than addition or multiplication: finding the factors of a large number takes many more steps than multiplying numbers of the same size.
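The 2^N scaling can be illustrated by classically simulating a uniform superposition. The sketch below is plain state-vector bookkeeping of my own, not a quantum algorithm from the sources cited: N qubits require 2^N complex amplitudes to describe, which is exactly why classical simulation of quantum systems becomes intractable so quickly.

```python
# A uniform superposition of n qubits assigns equal amplitude to
# every one of the 2**n basis bit strings.
import math

def uniform_superposition(n_qubits):
    """Classical state-vector of |+>^n: 2**n equal real amplitudes."""
    dim = 2 ** n_qubits
    amp = 1 / math.sqrt(dim)
    return [amp] * dim

state = uniform_superposition(3)
print(len(state))                            # 8 basis states for 3 qubits
print(round(sum(a * a for a in state), 10))  # probabilities sum to 1
```

Doubling the number of qubits squares the number of amplitudes the classical description must track, whereas the quantum system itself simply holds them natively.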
So, an efficient algorithm is one where the number of steps it takes to solve the problem grows slowly as the number of digits N increases (Aaronson, 2008). Also, as (Ovidiu Racorean, 2018) maintains in his paper "Spacetime Manipulation of Quantum Information Around Rotating Black Holes", the geometry of spacetime around a black hole can assume the properties of a quantum computer. In our attempts to build a scalable quantum computer, streams of photons are being used to encode quantum information. This becomes possible because one can encode quantum bits, or qubits, the standard information units of quantum computing, in the photons' degrees of freedom. The relevant degrees of freedom are those properties of the photons whose values can be represented as "0" and "1". Racorean uses the polarization and the orbital angular momentum (OAM) of photons as the carriers of quantum bits. Racorean also states, "The distorted geometry of spacetime near rotating black holes can create and manipulate quantum information encoded in beams of light that are emitted by, or that pass close to, these black holes" ("Quantum computers and black holes", Elsevier). This manipulation of information by the curved spacetime near rotating black holes parallels the process that occurs in a theoretical quantum computer. Racorean further explains, "A quantum computation process consists of photons travelling throughout a setup of mirrors, beam splitters, and prisms that switch the polarisation and twisted phase of photons to values that can be mapped onto 0 and 1. The novelty in my research is the suggestion that the geometry of spacetime near spinning black holes acts in an identical manner to this setup of prisms and mirrors".
This implies that the quantum code created by a spinning black hole could be decoded once we can build quantum computers. So, quantum computers could very well shed light on hitherto undiscovered fundamental properties of spacetime. Now, there exists a class of problems, among them factoring large numbers, long believed intractable for classical machines. Peter Shor of MIT showed in 1994 that factoring, which is suspected to be classically intractable though not known to be NP-complete, can be solved efficiently by a quantum computer: while for classical computers the computation time grows roughly exponentially with the number of digits, for a quantum computer running Shor's algorithm it grows only polynomially (Aaronson, 2008). There are many parallels between quantum computational processes and various natural processes. Processes such as the synaptic connections and neural pathways in our brain, or photosynthesis in plants, adopt the fastest and most efficient path, paralleling the way quantum computers function. A quantum computer can find the shortest path with fewer steps than a classical computer. Also, as in physical processes, quantum computation involves the sharing of information between systems. Seth Lloyd explains that when two electrons interact, their properties such as spin and polarization become entangled, which parallels the way the patterns of firing neurons in the brains of two conversing persons seem to interact. During the intermediate steps, the more information is lost in the encoding, the higher the complexity of the process; this is quite similar to long division, where many junk or useless digits appear in the intermediate steps (Lloyd, 2007). In the case of natural processes as well as quantum computers, the rate of information processing is limited by the energy and the number of degrees of freedom the system possesses.
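The classical cost of factoring can be felt with a naive trial-division routine, a sketch of my own (the function name is hypothetical) rather than anything from Shor's or Aaronson's analysis: for prime inputs the number of candidate divisors tried grows like √n, i.e. exponentially in the number of digits.

```python
# Classical trial division: the divisor count tried grows roughly
# like sqrt(n), i.e. exponentially in the digit count of n.
def trial_division_steps(n):
    """Return (smallest factor of n, number of divisions attempted)."""
    steps = 0
    d = 2
    while d * d <= n:
        steps += 1
        if n % d == 0:
            return d, steps
        d += 1
    return n, steps  # n itself is prime

for n in (10007, 1000003, 100000007):  # primes with 5, 7 and 9 digits
    _, steps = trial_division_steps(n)
    print(len(str(n)), "digits ->", steps, "divisions")
```

Each extra pair of digits multiplies the work by about ten here, while Shor's quantum algorithm would keep the cost polynomial in the digit count.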
In the 1960s, Ed Fredkin first proposed that the universe could be a computer, which he dubbed a 'cellular automaton': "a collection of "colored" cells on a grid of specified shape that evolves through a number of discrete time steps according to a set of rules based on the states of neighboring cells. The rules are then applied iteratively for as many time steps as desired" ("Cellular Automaton", Wolfram).
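The Wolfram definition quoted above can be realized in a few lines. This sketch of mine implements a one-dimensional elementary cellular automaton on a ring of cells, using Rule 30 purely as an arbitrary illustration; each cell's next state depends only on itself and its two neighbours.

```python
# One-dimensional elementary cellular automaton on a ring of cells.
def step(cells, rule=30):
    """Apply one update of the given Wolfram rule to a ring of 0/1 cells."""
    n = len(cells)
    out = []
    for i in range(n):
        left, me, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        pattern = (left << 2) | (me << 1) | right   # neighbourhood as 0..7
        out.append((rule >> pattern) & 1)           # look up bit of the rule
    return out

row = [0] * 15
row[7] = 1  # a single live cell in the middle
for _ in range(5):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Despite the triviality of the rule table, the evolving rows quickly develop intricate structure, which is the sense in which such automata "compute".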
In order to build a quantum computer, one has to choose a superposition of number sets and then perform certain operations on them with processors; the output is then a new superposition. In contrast to classical logic gates, quantum logic gates can perform operations on many sets of numbers simultaneously. Quantum computers incorporate the property of superposition of states: instead of following the classical computers' method of definite "yes" or "no" signals, they make use of mixed or superposed states, and since our universe is also described in the language of quantum mechanics, Seth Lloyd says that "quantum computing allows us to understand the universe in its own language." Researchers such as Thomas Harty of the University of Oxford and his colleagues have been able to trap ions and "read" the qubit state quite accurately, with an error rate of 0.07% (Harty, 2014).
Sasha Churikova writes: "The universe, however, might have already invested in a quantum computer. After all, information is processed in a very quantum mechanical way both on a tiny and large scale. The efficiency of these processes in our universe may very well suggest its true nature-of a quantum kind" ("Is the Universe Actually Giant Quantum Computer?").
The question about the nature of black holes is related to other topics of interest, such as the nature of dark energy, the fine-scale structure of spacetime, and the ultimate laws of nature.
(Seth Lloyd, 2007) gives us a succinct overview of the main characteristics of cosmic computers in his article "Black Hole Computers":
1. Merely by existing, all physical systems store information. By evolving dynamically in time, they process that information. The universe computes.
2. If information can escape from black holes, as most physicists now suspect, a black hole, too, computes. The size of its memory space is proportional to the square of its computation rate. The quantum-mechanical nature of information is responsible for this computational ability; without quantum effects, a black hole would destroy, rather than process, information.
3. The laws of physics that limit the power of computers also determine the precision with which the geometry of spacetime can be measured. The precision is lower than physicists once thought, indicating that discrete "atoms" of space and time may be larger than expected.
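Point 2 above, that a black hole's memory is proportional to the square of its computation rate, can be checked numerically with rounded constants (my own sketch, not Lloyd's calculation): the Bekenstein-Hawking bit count scales as M², while the Margolus-Levitin rate for total energy E = Mc², using the standard bound of 2E/(πħ) operations per second, scales as M.

```python
import math

G, HBAR, C = 6.674e-11, 1.0546e-34, 2.998e8  # SI units, rounded

def bh_bits(mass_kg):
    """Bekenstein-Hawking information capacity in bits (scales as M^2)."""
    return 4 * math.pi * G * mass_kg**2 / (HBAR * C * math.log(2))

def ml_rate(mass_kg):
    """Margolus-Levitin bound on ops/s for energy E = M c^2 (scales as M)."""
    return 2 * mass_kg * C**2 / (math.pi * HBAR)

# Doubling the mass doubles the rate but quadruples the memory,
# so memory grows as the square of the computation rate.
print(bh_bits(2.0) / bh_bits(1.0))   # 4.0
print(ml_rate(2.0) / ml_rate(1.0))   # 2.0
print(f"1-kg black hole: ~{bh_bits(1.0):.0e} bits")
```

A one-kilogram black hole comes out at a few times 10^16 bits here, of the order Lloyd quotes for such an object.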
Just as Wheeler stated "It from bit", Paola Zizzi says, "It from qubit." (Seth Lloyd, 2007; Y. Jack Ng, 2021): "The confluence of physics and information theory flows from the central maxim of quantum mechanics: at bottom, nature is discrete. A physical system can be described using a finite number of bits. Each particle in the system acts like the logic gate of a computer. Its spin "axis" can point in one of two directions, thereby encoding a bit, and can flip over, thereby performing a simple computational operation" ("Black Hole Computers", 54). All natural processes can thus be thought of as flippings of bits, and each flip takes a minimum amount of time. According to the Margolus-Levitin theorem, t, the time it takes to flip a bit, depends on the amount of energy E applied to the system. This, in turn, is related to the Heisenberg uncertainty principle, which posits that position and momentum, or time and energy, cannot both be measured exactly at the same time. The Margolus-Levitin theorem can be expressed mathematically by the simple inequality t ≥ h/4E, where h is Planck's constant. This theorem can be interpreted in a number of ways, and a large number of conclusions can be drawn from it: it has implications for the limits on the geometry of our spacetime and also for the computational capacity of our entire universe.
To calculate the computational capacity of ordinary matter, Seth Lloyd imagines how much computational power could be extracted by converting "one kilogram occupying the volume of one liter" to pure energy. Applying Einstein's famous formula E = mc² to this piece of matter and channeling all of that energy into the task of flipping bits, Lloyd arrives at the figure of 10^51 operations per second. When 1 kg of matter in a liter volume is converted into pure energy, it reaches a temperature of about 10^9 kelvin, and the entropy, which is essentially the energy divided by the temperature, comes to about 10^31 bits of information. This ultimate computer stores information in every single bit that the laws of thermodynamics permit, in the positions and velocities of its elementary particles. Such a computer is also highly parallel: it does not function as a single processor but as a huge array of processors, each acting independently.
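Lloyd's 10^51 figure can be reproduced in a couple of lines: convert a kilogram to energy with E = mc² and apply the Margolus-Levitin bound of 2E/(πħ) operations per second. This is a rough sketch with rounded constants, not Lloyd's own derivation, but it lands on the same order of magnitude.

```python
# Lloyd's "ultimate laptop": 1 kg converted to energy via E = m c^2,
# all of it spent flipping bits at the Margolus-Levitin limit.
import math

HBAR = 1.0546e-34  # reduced Planck constant, J s
C    = 2.998e8     # speed of light, m/s

E = 1.0 * C**2                       # ~9e16 joules in one kilogram
ops_per_second = 2 * E / (math.pi * HBAR)
print(f"{ops_per_second:.1e}")       # ~5.4e50, i.e. of order 10^51
```

The bound depends only on the total energy, which is why the same arithmetic applies equally to a laptop-sized plasma and to an exploding hydrogen bomb, as Lloyd notes below.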
Lloyd writes: "By comparison, a conventional computer flips bits at about 10^9 times per second, stores about 10^12 bits and contains a single processor. If Moore's law could be sustained, your descendants would be able to buy an ultimate laptop midway through the 23rd century. Engineers would have to find a way to exert precise control on the interactions of particles in a plasma hotter than the sun's core, and much of the communications bandwidth would be taken up in controlling the computer and dealing with errors. Engineers would also have to solve some knotty packaging problems. In a sense, however, you can already purchase such a device, if you know the right people. A one-kilogram chunk of matter converted completely to energy-this is a working definition of a 20-megaton hydrogen bomb. An exploding nuclear weapon is processing a huge amount of information, its input given by its initial configuration and its output given by the radiation it emits" ("Black Hole Computers").

Are Black Holes Graviton Condensates?
(Gia Dvali, 2012) views black holes as giant quantum computers and as systems consisting of gravitons that have undergone Bose-Einstein condensation. Jorge Alfaro and his colleagues have likewise treated black holes as consisting of a Bose-Einstein condensate of gravitons in their paper titled "Bose-Einstein graviton condensate in a Schwarzschild black hole." Sophia Zielinski viewed spacetime geometry as an emergent phenomenon that arises from the condensation of a large number of gravitons, modeling "black holes as quantum bound states of a large number of soft quanta subject to a strong collective potential" ("Spacetime Geometry from Graviton Condensation"). Now, to act as a computer, a black hole first needs to store information, and as theorists propose, the amount of information is encoded in the black hole's entropy and is proportional to the horizon surface area. Black holes can also redistribute or 'scramble' information at an extremely rapid pace. The paper "Black Holes as Critical Point of Quantum Phase Transition" attempted to show "that black holes can be understood as a graviton Bose-Einstein condensate at the critical point of a quantum phase transition, identical to what has been observed in systems of cold atoms." Bose-Einstein condensates at the quantum critical transition point can have fluctuations that extend through the entire system, and before quantum collapse occurs, the entropy, scrambling capacity and release time of these exotic fluids can correspond to those of a black hole.
Professor Immanuel Bloch has conducted experiments with Bose-Einstein condensates. In his experiments, Bloch has created optical lattices with intersecting laser beams, and with these lattices he has taken images of the atoms in the condensate, which display correlated quantum behavior. Dvali's research enters the realm of the quantum critical point to delve into the dynamics of interacting condensates. Bose-Einstein condensates are characterized by the presence of macroscopic quantum waves; by applying a magnetic field, one can change the strength with which the atoms interact, which can also arrange them in an orderly, lattice-like state, and one can use lasers to manipulate the spins of the atoms, enabling them to encode and register information.
Dvali maintains that, as the simplest, most compact and most efficient kind of storage devices, black holes store information using special quantum states, and they do it more efficiently than a Bose-Einstein condensate. So, learning about black holes' encoding mechanism could also enable us in the near future to store information in condensate-based quantum computers. Bloch maintains that in the black hole, "the interaction strength adjusts itself. We can simulate something like that by tuning the interaction strength to where the condensate is just about to collapse. The fluctuations become bigger and bigger and bigger as you get closer to the quantum critical point. And that could simulate such a system. One could study all the quantum fluctuations and non-equilibrium situations -all that is now possible by observing these condensates in situ, with high spatial resolution." However, the main obstacle to realizing this black hole-like information storage mechanism in condensate-based computers is the manipulation of the quantum states of the particles for information processing.
Many other researchers are also attempting to uncover a link between gravity and condensed-matter physics. As Sabine Hossenfelder writes, "In the tradition of Einstein, physicists generally think of curved space-time as the arena for matter and its interactions. But now several independent lines of research suggest that space-time might not be as insubstantial as we thought. Gravity, it seems, can emerge from non-gravitational physics" ("Is the Black Hole at Our Galaxy's Centre A Quantum Computer?").
In fact, research has shown that in the case of many exotic fluids, collective quantum behaviour can replicate the properties of curved space-time, so the equations of Einstein's theory of general relativity can be applied to understand their properties. However, it remains unclear how we might derive the full theory of general relativity from a picture in which space-time is imagined as a condensate. Physicists are still trying to work out how atomic condensates can be studied to uncover the properties of gravitational systems. In lab-based experiments conducted using Bose-Einstein condensates, physicists have discovered the sound-wave analogue of Hawking radiation and have also simulated conditions mimicking those found at the event horizons of black holes themselves.
In a 2012 Nature paper, (Bloch, 2012) found that just as Higgs-like particles can exist in two dimensions, one can study Bose-Einstein condensates that mimic the behaviors of black holes. The theoretical cosmologist Stefan Hofmann even hoped that one might find imprints of black holes behaving as quantum-critical condensates of gravitons in the gravitational waves released during binary black hole mergers. In "Probing the Constituent Structure of Black Holes," (Hofmann, 2012) proposed "a framework for the description of black holes in terms of constituent graviton degrees of freedom." In another paper, titled "A Quantum Bound-State Description of Black Holes," Hofmann attempted to develop a "relativistic framework for the description of bound states consisting of a large number of quantum constituents", and sought to apply the description to the interiors of black holes.
(Carlo Rovelli, 2012), however, believes that the condensate model cannot completely represent the dynamics of black-hole interiors. Nevertheless, in recent times a huge number of studies have been hinting at the possibility of discovering fascinating connections between quantum information and black-hole physics. Now, the rate at which black holes can perform computational tasks does not depend on their size or radius. A one-kilogram hole with a radius of about 10^-27 meter can perform just as many calculations per second as a billion-solar-mass black hole does, some 10^51 operations per second. What does change is the memory capacity. In regimes of weaker gravity, the total storage capacity is proportional to the number of particles, and hence to the volume; this changes when gravity becomes strong, because stronger gravity interconnects the particles, rendering them less able to register information than they were in the weak-gravity regime. The total storage capacity of a black hole is proportional to its surface area. Hawking and Jacob Bekenstein showed in the 1970s that a one-kilogram black hole can register about 10^16 bits.
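The 10^16-bit figure can be sketched numerically, assuming the standard Bekenstein-Hawking entropy formula S/k_B = A/(4 l_P^2), with A the horizon area of a Schwarzschild black hole and l_P the Planck length:

```python
import math

# Sketch of the Bekenstein-Hawking memory estimate for a 1-kg black hole:
# entropy S/k_B = A / (4 l_P^2), converted to bits by dividing by ln 2.

G   = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C   = 2.998e8        # speed of light, m/s
L_P = 1.616e-35      # Planck length, m

def black_hole_bits(mass_kg: float) -> float:
    r_s = 2.0 * G * mass_kg / C**2        # Schwarzschild radius
    area = 4.0 * math.pi * r_s**2         # horizon area
    return area / (4.0 * L_P**2) / math.log(2)

bits = black_hole_bits(1.0)
print(f"~{bits:.1e} bits")                # ~4e16, i.e. ~10^16 bits

# The serial 'clock time' is roughly the light-crossing time of the horizon:
r_s_1kg = 2.0 * G * 1.0 / C**2            # ~1.5e-27 m, the radius quoted above
crossing_time = r_s_1kg / C               # ~5e-36 s, near the quoted 10^-35 s
print(f"light-crossing time ~{crossing_time:.1e} s")
```

Note how the memory grows as the square of the mass (area goes as r_s^2 and r_s goes as M), while the light-crossing time, which sets the serial clock rate, grows only linearly.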
Black holes are incredibly fast information-processing systems, since they take around 10^-35 second to flip a bit; in contrast to the ultimate laptop, black holes act as serial computers, not parallel ones. In a parallel computer, a series of processors work simultaneously, while in a serial one, a single processor executes commands one at a time. As Lloyd states, "The ultimate laptop and black hole computer embody two different approaches to increasing computing power. The ultimate laptop is the supreme parallel computer: an array of processors working simultaneously. The black hole is the supreme serial computer: a single processor executing instructions one at a time" ("Black Hole Computers," 57). Now, to harness the computational prowess of a black hole, one needs to encode data in a piece of matter or in a beam of some wavelength, and as it passes the event horizon and falls towards the central singularity, the particles interact with one another and perform computations for a finite time before hitting the singularity. What happens there remains a mystery, since the black hole information paradox is yet to be solved for good; a full theory of quantum gravity should hold some answers. What is known for certain is that the output emanates in the form of Hawking radiation. A black hole weighing only 1 kilogram decreases in mass while radiating Hawking radiation to conserve energy, and lasts only some 10^-21 seconds before vanishing in a burst of gamma rays. If one could capture this radiation and decode the information in it, one could effectively use the output from black hole computers. For extreme supermassive black holes like the one in the blazar S5 0014+81, which harbors about 40 billion solar masses, it is predicted that they will last for 1.342×10^99 years, some 10^88 times the current age of the universe, very near the end of the Black Hole Era of the universe.
Even larger black holes, such as the one associated with the blazar TON 618 (a 66-billion-solar-mass black hole), the 40-billion-solar-mass black hole in the core of Holmberg 15A, the brightest cluster galaxy (BCG) of the galaxy cluster Abell 85, and the ultramassive black hole in the mass range of 40-100 billion solar masses in the center of the supergiant elliptical galaxy IC 1101, will take at least some 10^100 years to evaporate. All black holes will ultimately die by dissipation through Hawking radiation. As Lloyd explains, "The rate at which black holes radiate is inversely related to their size, so big black holes, such as those at the center of galaxies, lose energy much more slowly than they gobble up matter. In the future, however, experimenters may be able to create tiny holes in particle accelerators, and these holes should explode almost immediately in a burst of radiation. A black hole can be thought of not as a fixed object but as a transient congregation of matter that performs computation at the maximum rate possible" (57).

Black Holes, Branes, Strings, Quantum Entanglement and Teleportation
Now, another important aspect of the physics of black holes is the idea of quantum entanglement and teleportation, in which information is transferred from one particle to another at an incredible speed. As Anton Zeilinger describes it, quantum teleportation involves entangling two particles and then performing a measurement on one of them together with the matter that contains the information to be teleported. This measurement causes the information in the original particle to be erased, only to be later encoded on the second particle by virtue of entanglement. It is only with the results of the measurement that the information can later be decoded. In the case of black holes, as Lloyd explains, pairs of entangled photons can materialize at the event horizon because of the effect of the strong gravity on the curved spacetime near the horizon. From this pair, one photon flies outward as Hawking radiation towards an observer, while the other falls inward together with the matter that created the hole in the first place. In this process, the annihilation of the infalling photon functions as a measurement that transfers the information contained in the matter to the outgoing Hawking radiation. So, even though matter that falls past the event horizon of a black hole can never escape, information can leak out, albeit in a highly scrambled or processed form, via the Hawking radiation (Battula, 2020). In their paper "The Black Hole Final State," Gary Horowitz and Juan Maldacena pointed out the role that quantum entanglement plays in allowing the information from a black hole to escape. According to Horowitz and Maldacena, an observer outside the event horizon can calculate how much information resides in the Hawking radiation.
Lloyd similarly believes that, like the initial singularity of the Big Bang at the start of the universe, the final singularities inside black holes can also possess one unique state.
Also, researchers such as Andrew Strominger and Cumrun Vafa of Harvard University suggested in their 1996 paper "Microscopic Origin of the Bekenstein-Hawking Entropy" that black holes are composed of multidimensional structures called branes, which are part of string theory. It is in the waves on the branes that information can get encoded, eventually to leak out during the Hawking evaporation process. Samir Mathur has also endeavored to model black holes as a huge tangle of strings; according to his "fuzzball" theory, all the information about the matter that ever fell into a black hole is stored in the string-like fuzzball state of the black hole, and black holes emit this information when emitting Hawking radiation. Hawking himself invoked the idea of an apparent horizon instead of a true event horizon: quantum fluctuations prevent the formation of an actual event horizon, and the apparent horizon only temporarily holds matter and energy before eventually releasing them, albeit in a highly scrambled form.
Also, as Lloyd and Ng have pointed out, the question about the ingredients of a black hole is related to the question of the fundamental properties of our space-time itself: "the properties of black holes are inextricably intertwined with those of spacetime. Thus, if holes can be thought of as computers, so can spacetime itself. Quantum mechanics predicts that spacetime, like other physical systems, is discrete. Distances and time intervals cannot be measured to infinite precision; on small scales, spacetime is bubbly and foamy. The maximum amount of information that can be put into a region of space depends on how small the bits are, and they cannot be smaller than the foamy cells". If the entire bulk of spacetime is to be divided into discrete cells, the minimum length of these cells would equal the Planck length (l_P) of 10^-35 meter; it is at these scales that both quantum fluctuations and gravitational effects become non-negligible. The Planck length and Planck time have values near ~10^-33 cm and ~10^-43 s respectively. Craig Hogan, in his paper "Now Broadcasting in Planck Definition," views the real world as a 4-dimensional video display, in which the Planck bandwidth limit on maximum information transmission per unit time (1.85 × 10^43 Hertz) would construct spacetime as composed of Planck-size pixels everywhere, with a "frame refresh rate given by the Planck time, and pixel (or voxel) size given by the Planck length, ct_P = 1.6 × 10^-35 meters, in each of the three space dimensions" (3). Lee Smolin in 2006 coined the expression 'Atoms of Space and Time' to refer to the possibility of the existence of a minimal length for physical space (and time). Loop Quantum Gravity (Smolin, 1990; Rovelli, 1998) quantizes space and time into discrete levels, like the energy levels observed in ordinary quantum-mechanical systems, forming what its proponents dubbed a spin network.
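The Planck length, Planck time, and Hogan's "Planck bandwidth" all follow from the same three constants, as this small sketch shows:

```python
import math

# Deriving the Planck length, Planck time, and the Planck bandwidth (1/t_P)
# quoted in the text from the fundamental constants.

G    = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
C    = 2.998e8       # speed of light, m/s
HBAR = 1.055e-34     # reduced Planck constant, J*s

l_planck = math.sqrt(HBAR * G / C**3)   # ~1.6e-35 m  (~1.6e-33 cm)
t_planck = math.sqrt(HBAR * G / C**5)   # ~5.4e-44 s  (order 10^-43 s)
bandwidth = 1.0 / t_planck              # ~1.85e43 Hz, Hogan's figure

print(f"l_P = {l_planck:.2e} m, t_P = {t_planck:.2e} s, 1/t_P = {bandwidth:.2e} Hz")
```

Note that l_P = c t_P, which is exactly the pixel size ct_P that Hogan quotes.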
This is similar to the kind of complex 'pre-geometric structure' of space-time that Wheeler proposed. As part of the experimental investigation for observing the imprints of quantum gravity, any detection of minuscule delays in the arrival times of photons of varying energies, determined by the dispersion law for photons, could be a paradigm-shifting discovery. One could try to look for such delays in gamma-ray burst (GRB) photons that have had to travel for more than ten billion years to reach us. GRBs are the most luminous transient electromagnetic events in the entire observable universe; even accounting for relativistic beaming, their collimation-corrected energy budget can attain values of around 10^52 ergs, a large fraction of the Solar rest mass energy, released in ≈ 0.1-100 seconds and generated by the bulk acceleration of blobs of plasma to Lorentz factors of about 100-1000. Y. Jack Ng states: "For photons emitted simultaneously from a distant source, we expect an energy-dependent spread in their arrival times. So one idea is to look for a noticeable spread in arrival times for high energy gamma rays from distant gamma ray bursts (GRB)" ("Entropy and Gravitation"). Besides looking for GRBs, there have also been attempts to observe distant, gamma-ray-emitting quasars to detect 'spacetime foam': "Due to quantum fluctuations, spacetime is foamy on small scales. The degree of foaminess is found to be consistent with the holographic principle. One way to detect spacetime foam is to look for halos in the images of distant quasars" (Ng, "Spacetime Foam and Dark Energy"). Lloyd and Ng have described a novel way of mapping the geometry of spacetime in which the very act of mapping becomes a computational act of some sort. They describe their thought experiment thus: The process of mapping the geometry of spacetime is a kind of computation, in which distances are gauged by transmitting and processing information.
One way to do this is to fill a region of space with a swarm of Global Positioning System satellites, each containing a clock and a radio transmitter. To measure a distance, a satellite sends a signal and times how long it takes to arrive. The precision of the measurement depends on how fast the clocks tick. Ticking is a computational operation, so its maximum rate is given by the Margolus-Levitin theorem: the time between ticks is inversely proportional to the energy.
The energy, in turn, is also limited. If you give the satellites too much energy or pack them too closely together, they will form a black hole and will no longer be able to participate in mapping. (The hole will still emit Hawking radiation, but that radiation has a wavelength the size of the hole itself and so is not useful for mapping features on a finer scale.) The maximum total energy of the constellation of satellites is proportional to the radius of the region being mapped.
Thus, the energy increases more slowly than the volume of the region does. As the region gets bigger, the cartographer faces an unavoidable tradeoff: reduce the density of satellites (so they are spaced farther apart) or reduce the energy available to each satellite (so that their clocks tick more slowly). Either way, the measurement becomes less precise. Mathematically, in the time it takes to map a region of radius R, the total number of ticks by all the satellites is R^2/l_P^2. If each satellite ticks precisely once during the mapping process, the satellites are spaced out by an average distance of R^(1/3) l_P^(2/3). Shorter distances can be measured in one subregion but only at the expense of reduced precision in some other subregion. The argument applies even if space is expanding ("Black Hole Computers," 58-59).
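The two scalings quoted in the passage can be evaluated directly. The 1-meter mapping region below is an illustrative assumption chosen to show the orders of magnitude involved:

```python
# Sketch of the Lloyd-Ng scaling argument: a region of radius R supports
# at most R^2 / l_P^2 clock ticks during the mapping, giving an average
# spatial resolution of R^(1/3) * l_P^(2/3) per measurement.

L_P = 1.616e-35   # Planck length, m

def max_ticks(radius_m: float) -> float:
    return (radius_m / L_P) ** 2

def avg_resolution(radius_m: float) -> float:
    return radius_m ** (1.0 / 3.0) * L_P ** (2.0 / 3.0)

R = 1.0  # map a 1-meter region (illustrative)
ticks = max_ticks(R)            # ~3.8e69 total ticks
resolution = avg_resolution(R)  # ~6.4e-24 m average spacing
print(f"ticks <= {ticks:.1e}, resolution ~ {resolution:.1e} m")
```

The achievable resolution, ~10^-24 m for a meter-sized region, is far coarser than the Planck length itself, which is the point made in the next paragraph.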
The formula seems to describe the ultimate precision up to which distances in spacetime can be meaningfully measured. When the density gets too high, the apparatus teeters on the verge of becoming a black hole and quantum gravitational effects become important. Below the Planck length, spacetime geometry dissolves and nothing can be described. That level of precision is still much, much bigger than the Planck length. In future gravitational wave observations, such precise measurements might be carried out. Also, from this theory of the scaling of spacetime, the Bekenstein-Hawking formula for black hole entropy can be derived. As Lloyd writes: "This presents a universal bound for all black hole computers: the number of bits in the memory is proportional to the square of the computation rate. The proportionality constant is Gh/c^5, mathematically demonstrating the linkage between information and the theories of special relativity (whose defining parameter is the speed of light, c), general relativity (the gravitational constant, G) and quantum mechanics (h)". The Bekenstein bound limits the amount of information that can be stored within a spherical volume to the entropy of a black hole with the same surface area. An even stronger bound is the thermodynamic limit, which constrains the data storage of a system based on its energy, number of particles and particle modes. Also, based on mass-energy versus quantum uncertainty constraints, Bremermann's limit defines the maximum processing or computational speed of a self-contained system in the physical universe. The Margolus-Levitin theorem places a theoretical bound on the maximum computational speed per unit of energy: 6 × 10^33 operations per second per joule. However, if one gains access to quantum memory, this bound can be overcome.
Landauer's principle places a lower theoretical limit on energy consumption: kT ln 2 joules consumed per irreversible state change, where k is the Boltzmann constant and T is the operating temperature of the computer. This lower bound does not, however, apply to reversible computing. Also, without applying external energy (Vasanth, 2017; Vasanth, 2018), T cannot even theoretically become less than about 3 kelvins, the approximate temperature of the CMB (cosmic microwave background) radiation. However, as the cosmic microwave background radiation continues to cool on a timescale of 10^9-10^10 years, it should eventually enable 10^30 times as many computations per unit of energy. Just as physicists manipulate atoms or quantum wells to store and process information by carefully perturbing them into various excited states, artificially constructed cold degenerate stars could theoretically be used as titanic data storage devices. Also, some form of 'computronium' could be developed using the nucleons on the surface of neutron stars, which could form complex "molecules" usable for femtotechnology-based hypercomputation. In The Singularity is Near, Ray Kurzweil states that a computer the size of the universe would be capable of executing 10^90 operations per second. The mass of the universe is estimated to be around 3 × 10^52 kilograms, and if all the matter in the universe were turned into a black hole, it would last for 2.8 × 10^139 seconds before evaporating via the emission of Hawking radiation. During that period of its existence, such a cosmic black hole computer would perform 2.8 × 10^229 operations. Quite importantly, these bounds on computational power also have implications for the holographic principle, which hypothesizes that our 3-dimensional universe can, in fact, be described as a 2-dimensional entity sans gravity.
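Two of the bounds quoted above can be computed directly: the Landauer cost kT ln 2 (the 300 K operating temperature is an illustrative room-temperature assumption) and the Margolus-Levitin speed per joule, which is just the rate 4E/h divided by E:

```python
import math

# Landauer's principle: kT ln 2 joules per irreversible bit erasure.
# Margolus-Levitin per joule: 4/h operations per second per joule.

K_B      = 1.381e-23   # Boltzmann constant, J/K
H_PLANCK = 6.626e-34   # Planck's constant, J*s

def landauer_cost(temperature_k: float) -> float:
    """Minimum energy (J) dissipated per irreversible bit operation."""
    return K_B * temperature_k * math.log(2)

cost_300k = landauer_cost(300.0)        # ~2.9e-21 J per bit at room temperature
ml_per_joule = 4.0 / H_PLANCK           # ~6.0e33 ops/s per joule

print(f"Landauer cost at 300 K: {cost_300k:.2e} J per bit")
print(f"Margolus-Levitin rate per joule: {ml_per_joule:.2e} ops/s/J")
```

The second number reproduces the 6 × 10^33 operations per second per joule quoted in the text; the Landauer cost falls as T drops, which is why a cooling CMB improves the energy efficiency of computation.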
So, the maximum amount of information that can be put into any volume of space becomes proportional not to its volume but to its surface area. The holographic principle in turn seems to be related not only to a theory of quantum gravity but also to the fundamental quantum limits imposed on the resolution of any measurement process.
When the universe was young and radiation-dominated, its total entropy was 10^88 k_B, where k_B is Boltzmann's constant. The maximum number of operations that can have occurred in the universe since it began is 10^123. Today, the universe has an entropy of 10^103 k_B, some 10^15 times larger than in the earliest stages of the Big Bang. For a black hole, the entropy is proportional to its horizon surface area, which is larger for supermassive black holes. The Milky Way's supermassive black hole possesses an entropy of about 10^91 k_B. The value of the entropy will reach its maximum when black holes contribute over 1% of the total mass of the universe, some 10^20 years from now. The entropy will then lie somewhere in the range of 10^119 k_B to 10^121 k_B, and as these black holes eventually decay via Hawking radiation, the entropy will be conserved. Now, the present observed cosmic energy density is about 10^-9 joule per cubic meter, so the universe can be said to possess some 10^72 joules of energy. Applying the Margolus-Levitin theorem, we find that our universe can compute at a rate of some 10^106 operations per second, consistent with the total of 10^123 operations that can have occurred since it began. The holographic principle also states that the maximum storage capacity of the universe is around 10^123 bits, which is equal to the total number of operations that the universe has performed so far. This value falls close to the range of the maximum attainable entropy of our universe, from 10^119 k_B to 10^121 k_B. As Lloyd states, "the universe has performed the maximum possible number of operations allowed by the laws of physics." It is when one converts chunks of mass into pure energy, or into streams of massless particles like photons and neutrinos, that matter seems to contain the maximum amount of information.
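The universe-wide rate follows from the same Margolus-Levitin formula used earlier, applied to the ~10^72 J energy budget quoted in the text:

```python
# Rough check of the universe-as-computer rate quoted above, using the
# text's figure of ~1e72 J for the total cosmic energy budget.

H_PLANCK = 6.626e-34  # Planck's constant, J*s

total_energy = 1e72                          # J, as quoted in the text
ops_rate = 4.0 * total_energy / H_PLANCK     # Margolus-Levitin rate, 4E/h

print(f"~{ops_rate:.1e} ops/s")              # ~6e105, i.e. ~10^106
```

Multiplying this rate by the age of the universe (~4 × 10^17 s) lands near the 10^123 total-operations figure the text cites.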
In such states, the entropy density of matter is proportional to the cube of its temperature, whereas the energy density of the particles, which corresponds to the number of operations that the particles can perform, goes as the fourth power of their temperature. So, the number of operations raised to the three-fourths power gives the total number of bits; for the whole universe, this amounts to some 10^92 bits.

Conclusion
The article looked at various entities, ranging from atoms and elementary particles, ordinary computers, human brains and thought processes, and black holes, to the entire cosmos, to show that the process of computation can indeed be considered a fundamental feature underlying all these phenomena over vast scales. The main aim of the article was to show how thinking about black holes as the ultimate examples of the most efficient serial computers conceivable could also help us think about the fundamental properties of spacetime, so that these properties could then be manipulated to achieve hypercomputation. Nick Bostrom once pondered the possibility of creating simulations of human minds by extracting the maximum amount of computational power from the whole universe. He imagines creating Dyson spheres around Sun-like stars: "For a star like our Sun, this would generate 10^26 watts. How much computational power this would translate into depends on the efficiency of the computational circuitry and the nature of the computations to be performed. If we require irreversible computations, and assume a nanomechanical implementation of the "computronium" (which would allow us to push close to the Landauer limit of energy efficiency), a computer system driven by a Dyson sphere could generate some 10^47 operations per second" (Bostrom, Superintelligence 102). Now, if our posthuman descendants could travel at around 99% of the speed of light, they could hope to colonize some 2×10^20 stars, and by using Dyson spheres to execute 10^47 operations per second each, they could hope to perform some 10^67 operations per second. Since a typical star can sustain its luminosity for some 10^18 seconds, as many as 10^85 operations could be performed from the extraction of energy from all the stars of our universe.
Bostrom further imagines that if we could make use of reversible computation, create artificially constructed cold degenerate stars, or make use of dark matter, the number of computational operations could be enhanced by several additional orders of magnitude. He then describes why this number of 10^85 computational operations is too huge for us to grasp. Bostrom elaborates that to simulate the entire history of neuronal functioning, in all its minute details, that has occurred in the history of life on Earth, one needs at least 10^31-10^44 computational operations. He further imagines that our posthuman descendants might someday decide "to run human whole brain emulations that live rich and happy lives while interacting with one another in virtual environments" (Superintelligence 102-103). To perform a single whole-brain emulation we need a power of 10^18 ops/s, and with some 10^27 operations, one could sustain a brain emulation for 100 subjective years. So, with 10^85 operations, as many as 10^58 human lives could be emulated in all their rich details. Then he goes on to say, "In other words, assuming that the observable universe is void of extraterrestrial civilizations, then what hangs in the balance is at least 10,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 human lives (though the true number is probably larger). If we represent all the happiness experienced during one entire such life with a single teardrop of joy, then the happiness of these souls could fill and refill the Earth's oceans every second, and keep doing so for a hundred billion billion millennia. It is really important that we make sure these truly are tears of joy" (Superintelligence 103).
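Bostrom's chain of estimates can be rechecked under the assumptions quoted above; the 300 K operating temperature for the Landauer limit is an illustrative assumption (Bostrom's own figure of ~10^47 ops/s per star comes out within a small factor of it):

```python
import math

# Rechecking Bostrom's chain of order-of-magnitude estimates:
# Dyson-sphere power, Landauer-limited irreversible computing near 300 K,
# 2e20 reachable stars, ~1e18 s of stellar lifetime, and 1e27 ops per
# 100 subjective years of whole-brain emulation.

K_B = 1.381e-23                                      # Boltzmann constant, J/K
power_per_star = 1e26                                # W, solar-type star
ops_per_joule = 1.0 / (K_B * 300.0 * math.log(2))    # Landauer limit at 300 K

ops_per_star = power_per_star * ops_per_joule        # ~3e46, near Bostrom's 1e47
ops_total_rate = 2e20 * ops_per_star                 # ~1e67 ops/s across all stars
ops_lifetime = ops_total_rate * 1e18                 # ~1e85 operations in total
emulated_lives = ops_lifetime / 1e27                 # ~1e58 emulated lives

print(f"{ops_per_star:.1e} ops/s per star, {emulated_lives:.1e} emulated lives")
```

Each step reproduces the corresponding order of magnitude in the text: ~10^47 per star, ~10^67 in aggregate, ~10^85 over stellar lifetimes, and ~10^58 century-long emulated lives.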