Toridion Project

Welcome to the official homepage of the Toridion Project, a quantum software research project with a simple aim:

 

To develop a new model to aid the creation of a dependable, reversible algorithm that allows the lossless compression of any input data set (I) of any length, such that the size of the generated output data (O) is consistently smaller than that of the input (O < I).

Anyone familiar with information theory will understand the seemingly impossible task that the above mission statement implies. With that in mind, the project is driven by the desire to push against immovable obstacles in the hope that what lies behind them will advance our understanding, or bring to light new understanding, of how such systems behave.

What is Toridion?

Toridion is a "special purpose" Quantum Simulation that represents a new paradigm of memory storage. Its memory storage is a highly sophisticated blend of inaccurate (probabilistic) memory nodes and a continuous virtual visual/data cortex, which perpetually tries to solve quantum problems within its own memory while simultaneously writing new problems and solutions back to the same or new neural pathways. The result is a very powerful memory architecture that can create "programs" by "experience design", and these programs can be leveraged to accelerate specific tasks such as data access and data recognition by many orders of magnitude. Toridion represents one of the world's first "self-alert memory systems" (meaning it can recognise the fact that it 'recognises' something without holding an actual memory of the thing it 'recognises'), and it is hoped that further research will identify ways in which Toridion can solve very useful problems.

What Toridion is not

Toridion is not a “Theory of Everything”, nor is it an “infinite compression” magic bullet.

Why is a new model required, and why “quantum” ?

In pursuit of its aim, the Toridion Project recognises the role that quantum computing will inevitably play in the coming years. Anyone who follows current developments in the quantum computing field will already be aware that, at the present level of development, most quantum computing is at best experimental and often ‘theoretical’. The essence of Toridion is exploration, and for that matter, exploration that has to answer to no one, least of all academic or corporate pressures. There is a saying here: “If we all return to the same drawing board, then how will we ever progress?” Taking that a step further, who says we even need a drawing board? This is a project of idealistic, rational, standard and non-standard scientific exploration that is difficult to classify as one science or another; sometimes it is too “unruly” for mathematics, or too “unreliable or chaotic” for physics.

Quantum was chosen firstly because the model relates to particle theory at minute scale and ridiculous speeds, and secondly because one of the ultimate aims of the Toridion Project is to create useful “real world” applications that can bridge the gap between quantum computers and the rest of us. Faster data transfers and reduced data footprints are pivotal in reducing the currently exponential rise in energy consumption from the IoT, cloud storage, streaming and always-on technology. If the Toridion Project contributes even 1% to achieving global energy reduction, then we, for one, think the juice is worth the squeeze.

 

What problem is Toridion trying to solve with virtual particles?

The Toridion Project is focussed primarily on the development of compression software. For the information theorists amongst us, as alluded to in the project description, the limitations of arbitrary compression based on mathematical models that do not rely on some scheme of pattern recognition are well understood. For those with limited knowledge of information theory, the following example is a simple description of the problem data scientists and information scientists face:

“It is simply not possible to store all permutations of a large value with a smaller value”

The previous statement is proved simply as follows: taking the simple sequence AA, AB, BA, BB, it is easy to show that all four permutations cannot be represented by a single value ‘A’ or ‘B’. If A = AA and B = BB, then AB and BA are ambiguous and impossible to represent.
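To see this concretely, the following short Python sketch (an illustration added here, not part of the Toridion codebase) tries every possible assignment of a one-symbol code to the four two-symbol inputs and confirms that no collision-free assignment exists:

    from itertools import product

    pairs = ["AA", "AB", "BA", "BB"]   # all four two-symbol inputs
    codes = ["A", "B"]                 # only two one-symbol outputs

    # Try every possible assignment of a code to each pair and check whether
    # any assignment is collision-free (i.e. every pair gets a distinct code).
    collision_free_exists = any(
        len(set(assignment)) == len(pairs)
        for assignment in product(codes, repeat=len(pairs))
    )
    print(collision_free_exists)  # False: at least two pairs always share a code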

Taking this a little further, the sequence AA,AA,BB,AA,AA,AA,AA could be reduced to A,A,B,A,A,A,A. That would be a neat 50% reduction in the space used; however, at this scale it is easy to arrive at the simple conclusion:

“you can compress all of the data some of the time and some of the data all of the time but not all of the data all of the time”

The above is true for values at any scale.
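A short worked count (again purely illustrative) shows why: there are 2^n distinct inputs of exactly n bits but only 2^n - 1 distinct outputs shorter than n bits, so any lossless scheme must leave at least one input uncompressed.

    def shorter_strings(n):
        # Number of distinct binary strings strictly shorter than n bits:
        # 2^0 + 2^1 + ... + 2^(n-1) = 2^n - 1
        return sum(2 ** k for k in range(n))

    for n in (2, 8, 128):
        inputs = 2 ** n                # distinct inputs of exactly n bits
        outputs = shorter_strings(n)
        print(n, inputs - outputs)     # always 1: one input too many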

 

How then is the Toridion Project hoping to meet the expectations of its project definition?

Accepting the well understood principle described above and the limitations it implies, Toridion was born out of utter stubbornness and a refusal to accept that such a scheme cannot exist. If that means the creation of a new model, an extension of the standard model of particle physics, or just plain “bending the rules”, then we have a duty to explore those possibilities. In its current state of development, Toridion is still a primitive model of a “theoretical particle” that has been designed with inherent behavioural characteristics that make it sympathetic to the project’s needs. In particular, the particle exhibits “radioactive” characteristics, i.e. exponential decay, as well as both ‘linear’ and ‘erratic’ acceleration (it is too early to quantify any chaotic nature in the model) that theoretically moves beyond light speed.

 

What has Toridion Project developed to date?

On the one hand, in the 6 years since development began, Toridion has created as many questions as answers; on the other, there has been some exciting progress. Most notably:

1: In order to deal with large integer and floating point numbers, a bespoke maths library was developed. “lib_toridion” is a suite of functions that handles both integer calculations and floating point operations to many thousands of decimal places (an illustrative sketch of this kind of arbitrary-precision arithmetic is given after this list). There are also a number of Toridion-specific functions that operate on mixed-type BCD and decimal base-10 numbers simultaneously.

2: The definition of a useful mathematical approximation function nicknamed “binary quantum rounding”. BQR is a technique by which it can be shown (at least in terms of the model) that a particle can exist in multiple states at the same time and yet also remain ‘different’ or, for want of a better description, ‘invisible or dark’. BQR is a form of numerical manipulation that can approximate the state of a particle to within <1% at 128-bit accuracy with very little information about the initial state of the system, and it is currently considered critical to the ultimate success of the project.

3: On paired systems, Toridion’s ability to compress and decompress arbitrary data has been extensively demonstrated. During tests in 2014, lossless compression rates of 90%+ were routinely achieved using the well-known “Canterbury Corpus” data files.

4: In December 2014, developers working on Toridion successfully transmitted the first ‘non-paired’ compressed data across a public/private network. Whilst the data compression achieved was only 50%, the test was a milestone in the development, as until this time compression and extraction had been confined to paired systems.

5: April 2015: prototype Solid Matter Quantum Hard Drive (SMQHD), a prototype secure storage medium that requires no electrical power.

6: December 2015: researching the potential of Thought Attached Memory (TAM). Building on its research in probabilistic memory development, the Toridion Project will focus on further research into TAM technology.

7: February 2016: aim set for a commercially viable quantum data teleportation platform within 2 years.

8: July 2016: Toridion-based quantum gravity models predict 81% of El Niño events since 1933.
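Returning to item 1 above: lib_toridion itself has not been published, so the sketch below uses Python’s standard decimal module purely as an illustrative stand-in for the kind of arbitrary-precision arithmetic such a library has to provide (the precision figure and the operations are examples, not Toridion’s own):

    from decimal import Decimal, getcontext

    getcontext().prec = 5000            # work to 5,000 significant digits

    # A floating point operation far beyond native 64-bit precision:
    root2 = Decimal(2).sqrt()
    print(str(root2)[:60] + "...")      # first digits of the square root of 2

    # Integer arithmetic in Python is already exact at any size:
    big = 2 ** 4096
    print(len(str(big)), "decimal digits")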

 

What’s the next challenge for the Toridion Project?

Whilst Toridion currently suffers in part from the same fundamental limitation of data ambiguity (i.e. the same value sometimes represents more than one solution), it also exhibits some inherent characteristics that may contain solutions to the problem. One is that, unlike the AA AB BA BB puzzle, at the 128-bit scale that Toridion operates at, whilst a great many states are indeed non-unique, a great number are unique. It further appears that a seemingly predictable pattern is emerging which shows these ambiguities to be ‘scattered’ in clumps.

Secondly, the Toridion encoding formulas exhibit an almost “time linear complexity” (meaning that, across the range, the processing time to encode a Toridion byte “TiB” is almost identical). The time differences are very small indeed, and this, along with further manipulations of the model (for example, rotation during encoding), provides possible sources of predictable information by which to differentiate ambiguous data.

Thirdly, time and dimension are perhaps the most interesting avenues to be explored. Whilst the mathematical mechanics of storing data in ‘near time linear’ fixed-size Toridion Byte containers are already at an advanced stage of practical development and use, the inclusion of T&D in the formula is currently seen as the most likely approach that will allow software to decode the ambiguous parts. In theory this involves using a BQR value to describe a near miss for a particular Toridion ‘quantum state’. Using this already predictable value, further values are extrapolated independently of the stored data and are then used to identify one or more inputs to the decode formula. The particular scheme for selecting the dimension- and time-related inputs is pre-computed prior to compression according to the entire data value and to values sampled from the data set at discrete intervals. These are then used to manipulate the actual encoding phase to essentially “lock in” a signature. If the signature can be represented as a function of the encoded data and the BQR value, then the predicted signature can be computed from minimal information and subsequently used to select the correct dimension that the particle occupies at any single discrete time period. Hence it should be possible for a computer with sufficient resources to determine the correct path to select during decoding.
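As a very rough illustration of that selection step (every name below is a hypothetical stand-in; the actual BQR and signature formulas are not public), a decoder could recompute the predicted signature from the encoded data and the BQR value alone, then pick the candidate decoding whose locked-in signature matches:

    import hashlib

    def predicted_signature(encoded, bqr_value):
        # Stand-in for a signature derived only from the encoded data and the
        # BQR value; a hash is used here purely for illustration.
        material = encoded + repr(round(bqr_value, 12)).encode()
        return hashlib.sha256(material).digest()[:8]

    def select_decode_path(encoded, bqr_value, candidates):
        # `candidates` is a list of (decoded_bytes, locked_in_signature) pairs,
        # where the signature was "locked in" during the encoding phase.
        target = predicted_signature(encoded, bqr_value)
        for decoded, signature in candidates:
            if signature == target:
                return decoded
        return None  # ambiguity unresolved; a real decoder would need a fallback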

 

How might Toridion be applied to solve problems in contemporary science?

An obvious first choice is space exploration. As more transducers and scientific experiments are deployed on spacecraft and rovers, so too does the amount of data being collected increase. Transmitting that data at acceptable speeds across millions of kilometres has to be powered, with the power gathered from the Sun or from onboard power cells. The longer the transmission times and the further afield the probe, the more power is required. In some situations time is critical (as observed with the recent comet landing); not having enough power to recover the mission data could be a primary cause of mission failure, even if the data was already captured but could not be transmitted before power was lost. Toridion already offers a practical solution for encoding large data sets into greatly reduced formats that are, by design, a much easier proposition for quantum computers to decode.

Where can I keep up to date on Toridion Project developments?

You can follow the Toridion Project on Twitter @toridionite, or if you have specific questions or would like to get involved, contact the project directly via the contact page on this website. You might also like to subscribe to my newsletter for regular updates on Toridion and other AI developments.
