Optical computer 3

OK, this one is a little odd, but it is a dynamic computer with no electrical wires (outside of individual devices).

OLED = output: plain on/off, or R on/off, G on/off, B on/off, or floating point (capacitive pulses at a rate)
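To make that concrete, here is a minimal sketch (Python, all names made up by me) of what one pixel's output encoding could look like, assuming three on/off color channels plus a pulse-rate field for floats:

```python
# Hypothetical sketch of one pixel's output encoding.
# Nothing here is a real device API; every field is an assumption.

from dataclasses import dataclass

@dataclass
class PixelOutput:
    r: bool          # red channel on/off
    g: bool          # green channel on/off
    b: bool          # blue channel on/off
    pulse_hz: float  # capacitive pulse rate, used to carry a float

def encode_3bits(value: int) -> PixelOutput:
    """Pack 3 bits into the R/G/B on-off states (no pulse data)."""
    return PixelOutput(bool(value & 4), bool(value & 2), bool(value & 1), 0.0)

def encode_float(x: float, max_hz: float = 1e6) -> PixelOutput:
    """Carry a 0..1 float as a pulse rate instead of discrete bits."""
    return PixelOutput(False, False, False, pulse_hz=x * max_hz)
```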

Antenna and logic gate: the gate powers the antenna when it is open.
Internal CPU.

Microwave laser powers the antennas. (Edit: the smaller the wavelength, the better, for density's sake.)

RGB laser sends data in.

Then another system watches with photosensors, based on what it is supposed to be monitoring. It can shift “focus” between tasks and leave the others rolling while they are not observed.
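Something like this toy watcher, maybe: it only samples the channels currently in focus, while the rest keep running unobserved. Everything here is a placeholder, not a real design:

```python
# Toy sketch of the watcher idea: photosensor channels keep running,
# but the monitor only samples the ones currently in "focus".

class Watcher:
    def __init__(self, channels: list[str]):
        self.channels = channels
        self.focus: set[str] = set()

    def shift_focus(self, *names: str) -> None:
        self.focus = set(names)

    def sample(self, read_sensor) -> dict[str, float]:
        # Unfocused channels keep rolling; we just don't read them now.
        return {name: read_sensor(name) for name in self.focus}

w = Watcher(["out0", "out1", "out2"])
w.shift_focus("out1")
print(w.sample(lambda name: 0.5))  # {'out1': 0.5}
```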

More than one input channel = multiple RGB lasers (input threads)

More than one output channel = multiple RGB sensors (SENSITIVE RGB sensors!)

So with effort, a completely expandable computer can be made that can be completely parallel. However,
the CPUs are actually Pcpus (pixel CPUs) :slight_smile: so each “Pcpu” can be a logic formula, or a small script, or a ??
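Here is a toy model of what a “Pcpu” might be in software terms: a reprogrammable cell holding a logic formula or a small script, with its own local memory. All the names are my assumptions:

```python
# Toy model of a "Pcpu": a pixel-sized cell that can be reprogrammed
# with a logic formula or a small script. Purely illustrative.

from typing import Callable

class Pcpu:
    def __init__(self):
        self.program: Callable[[tuple], tuple] = lambda inputs: inputs
        self.memory: dict = {}   # each Pcpu has its own resources

    def reprogram(self, program: Callable[[tuple], tuple]) -> None:
        # No static connections: the cell's behavior can be swapped any time.
        self.program = program

    def step(self, inputs: tuple) -> tuple:
        return self.program(inputs)

# Example: turn one cell into an AND gate.
and_gate = Pcpu()
and_gate.reprogram(lambda inp: (inp[0] and inp[1],))
print(and_gate.step((True, False)))  # -> (False,)
```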

Help me solidify this…

What math do you plan on using for the instructions? What would the bit look like?

I am not sure,
I am not the best software person :slight_smile:

I do know that static connections are the issue, so: no static connections,
totally reprogrammable.

I don’t really know what would be the most efficient “base” to run

Computers are Turing complete finite state machines.
How will your dream machine store a state?
Will you pour light into small buckets? :slight_smile:

The “top level” that controls the data inputs and outputs can store data, each “Pcpu” has its own resources,
and a “PRam” could also be made.

I have been thinking about it: each “packet” would be color vector = address, amplitude float = data, and pulse rate I/O = binary?
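A rough sketch of that packet layout, with the color vector as the address, the amplitude as the data, and the pulse rate flagging binary I/O; the field names, ranges, and grid size are assumptions of mine:

```python
# Hypothetical packet layout: color vector addresses a Pcpu, amplitude
# carries the payload, pulse rate flags binary I/O. All fields assumed.

from dataclasses import dataclass

@dataclass
class LightPacket:
    color: tuple[float, float, float]  # (R, G, B) vector = address, each 0..1
    amplitude: float                   # data value
    pulse_hz: float                    # e.g. 0 = plain data, >0 = binary I/O

def address_of(packet: LightPacket, grid: int = 256) -> int:
    """Map the RGB color vector onto a flat Pcpu address."""
    r, g, b = packet.color
    return (int(r * (grid - 1)) * grid + int(g * (grid - 1))) * grid + int(b * (grid - 1))
```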

The idea is that millions of “slaves” listen to what a “host” directs to them and do what it says, always broadcasting their results.
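And a sketch of that host/slave pattern, reusing the toy `Pcpu` and `LightPacket` from the sketches above: the host broadcasts a packet, the addressed cell runs its program, and the result is published into a shared “PRam”. Again, purely illustrative:

```python
# Sketch of the host/slave pattern: the host broadcasts packets, every
# Pcpu listens, the addressed one acts and its result is published.

class Host:
    def __init__(self, pcpus: dict[int, "Pcpu"]):
        self.pcpus = pcpus
        self.pram: dict[int, float] = {}   # shared "PRam" at the top level

    def broadcast(self, packet: "LightPacket") -> None:
        addr = address_of(packet)
        for cell_addr, cell in self.pcpus.items():
            if cell_addr == addr:                  # only the addressed cell acts
                (result,) = cell.step((packet.amplitude,))
                self.pram[cell_addr] = result      # results are always published
```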

QDot (quantum dot) tech has made optical sensors that are 10 times more accurate.

(Also another thing to note: half of the PC could be powered by the other half's heat, using QDot and thermoelectric crystal tech.)

Well, photons can change the energy state of electrons and move them to higher shells. There is the fact that such interactions are probabilistic to consider. I can’t think of a single reason to expect classical behavior. Maybe controlling the density of the signal could promote predictability. There’s also the issue of volatility to consider. The electrons can give off photons, but then their energy state would change and they move to lower shells. I would assume that storage would need to be maintained in some way. It’s an interesting idea.

Disclaimer: I am not a scientist…“grain of salt”.

http://researcher.ibm.com/researcher/view_project.php?id=2757

:smiley:

On December 10, 2012, IBM announced a breakthrough optical communication technology that has been verified in a manufacturing environment. The technology, called “silicon nanophotonics”, uses light instead of electrical signals to transfer information for future computing systems, allowing large volumes of data to be moved quickly between computer chips in servers, large data centers, and supercomputers via pulses of light.

An interesting proposition to delve into. I believe this idea has plenty of merit for further progress.

I already mentioned here one technology that is already being used: immediate transfer of data using entangled particles. When you have two “entangled” particles, they are the same particle bilocated. Then you place one on a planet and the other on another planet, even in another galaxy. And if you magnetically change one, the other shows the same reaction. You can then use them to transfer 1s and 0s, as in binary or Morse. The transfer of info is instantaneous; it breaks the “light speed limit” for info being transmitted (perhaps the barrier is only real for mass traveling, but I would not bet millions on saying there is no way mass can go faster than light, because of teleportation, but well, let’s talk about that some years later when more things come out…).

That is already being used to “secure communications”. All the encryption stuff is just for the masses who don’t have the technology.

Using entangled particles, you can forget wires forever. You have a network that you can move from one place to another without needing wires to communicate.

By the way, it is the same way telepathy works in those gifted humans who are able to “control it”. We all have “quantum transmitters”, and we don’t need to be cooled to near zero kelvin, so it shows: first, how incredibly well designed life is (animals are also telepathic; for example, dogs know and feel their owners’ thoughts); second, that quantum computers are possible at normal temperatures.

So, about efficiently absorbing single photons.

sorry. never mind.

I don’t think that they (photons) can be properly absorbed and emitted today. Though with the current level of technological progress, this could be done in the near future.

The trick is that the photon must essentially go through conversion to an electron for storage; otherwise it is a no-go. This would let optical electronics coexist with carbon ones.

Until the next major breakthrough happens.

From the article:

The thinking goes like this. Because energy can exist in a superposition of states, it can travel a variety of routes around the network at the same time. And when it finds the correct destination, the superposition collapses, leaving the energy at the reaction centre. The result is an almost perfect transfer of energy.

But Vattay and Kauffman say that this kind of pure quantum process cannot be responsible either. That’s because a number of quantum processes slow down the movement of quantum objects through random networks like this. “Quantum mechanics has adverse effects too,” they say.

One of these party-poopers is known as Anderson localisation, a phenomenon that prevents the spread of quantum states in random media. Because the quantum state acts like a wave, it is vulnerable to interference effects, which prevent it propagating in a random network.

Another is the quantum zeno effect, the paradoxical phenomenon in which an unstable state never changes if it is watched continuously. That’s because watching involves a series of measurements that constantly nudge the state, preventing it from collapsing. This is the quantum version of the watched-pot-never-boils effect.

A similar thing happens to the quantum state of the energy during light harvesting. This quantum state will inevitably interact with the environment but these interactions act like measurements. This triggers a quantum zeno-like effect that prevents the state from collapsing at the reaction centre. So the energy transfer cannot occur in this way, say Vattay and Kauffman.

Instead, they propose a new process in which the quantum search mechanism and the interaction with the environment combine to overcome Anderson localisation. It is the interplay between these processes that delivers the energy to the reaction centre in an optimal way, they say.

If you can absorb a single photon, and emit a single photon, you have a system that can transport data. The system here is almost as efficient as it can be.

ahem…

Dood… share that on G+.

I was wondering if it would become possible to utilize that 3rd dimension that’s so unused in computing right now.

If we could start having this technology in CPUs and GPUs, that would mean the end of the massive GPU brick design, for one thing.

80 times less power!? The EPA is going to have to create a platinum tier for their Energy Star label if it uses that small an amount.

http://www.forbes.com/sites/alexknapp/2014/08/07/ibm-builds-a-scalable-computer-chip-inspired-by-the-brain/

This is maturing nicely

and so is HP: http://www.gopubliccounsel.org/?p=1278

There are lots of ways to make optical switches. The mechanisms I researched once upon a time (20 years ago) relied on optical gratings. Most people are familiar with the way different colors of light exit a prism at different angles, but a diffraction grating can also give this effect. When a laser beam of a single wavelength falls on a grating, the spacing of the grating interacting with the wavelength of the light causes the light to diffract off at a known angle.
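The “known angle” comes from the standard grating equation, d sin θ = mλ. A quick numeric check for plausible values (a 1 µm grating and a 633 nm HeNe line; both numbers are just examples):

```python
# The known diffraction angle follows d*sin(theta) = m*lambda.
# Quick check for a 1 micron grating and a 633 nm HeNe laser line.

import math

def diffraction_angle_deg(spacing_m: float, wavelength_m: float, order: int = 1) -> float:
    s = order * wavelength_m / spacing_m
    if abs(s) > 1:
        raise ValueError("no propagating diffraction order")
    return math.degrees(math.asin(s))

print(diffraction_angle_deg(1e-6, 633e-9))  # ~39.3 degrees, first order
```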

Spectrometers that measure the spectrum of light sources such as stars have a precisely etched grating on a hard surface, but there are other ways to make gratings. For example, acoustic switches can set up a standing wave on an optical element and quickly turn on and off, but the time for the switch to change states isn’t any faster than the speed of sound in the material.

To get a faster operating switch, an electric field can be applied to some materials, and a grating will appear in the material (google Kerr Effect). A real application for this kind of switch is in “Q-switched” lasers that build up large pulses.

Another technique that might hold promise for optical computers would be to actually create the grating from the laser light beams themselves. When two beams cross in a crystal, the electric field of the light itself can interact with the matter to create a periodic pattern, a grating. A third beam will either pass through or deflect depending on the presence or absence of the first two beams. We tested several materials measuring state change, switch life, etc. Some crystals actually have a significant time before the field relaxes, which could serve as kind of “memory,” and the data density of storing 3 dimensional holograms in even a 1 cm^3 crystal is immense.
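For a feel of “immense”, here is a deliberately optimistic back-of-envelope: assume one bit per wavelength-sized voxel in that 1 cm^3 crystal (real holographic capacity depends on many more factors):

```python
# Back-of-envelope only: one bit per (wavelength)^3 voxel in 1 cm^3.

wavelength = 500e-9          # 500 nm, green light
side = 1e-2                  # 1 cm
voxels = (side / wavelength) ** 3
print(f"{voxels:.1e} bits ~ {voxels / 8 / 1e12:.0f} TB")  # 8.0e+12 bits ~ 1 TB
```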

I have revisited the topic only a few times in the last two decades, but all of the elements for a working, purely optical computer have been physically possible (if not practical) for a while now.

Here is another company broaching the subject:

Small, eco-friendly optical supercomputers may soon be crunching quadrillions of calculations per second (exaflops) if a company called Optalysys has its way. It claims to be months away from demonstrating a prototype optical computer that will run at 346 gigaflops to start with – not as fast as the best supercomputers, but pretty good for a proof of concept. Here’s how it works: low-intensity lasers are beamed through layers of liquid crystal grids, which change the light intensity based on user-inputted data. The resulting interference patterns can be used to solve mathematical equations and perform other tasks. By splitting the beam through multiple grids, the system can compute in parallel much more efficiently than standard multi-processing supercomputers (as shown in the charming Heinz Wolff-hosted video below).
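One way to see why interference patterns can “solve equations”: a lens optically performs a 2-D Fourier transform, so a convolution becomes a cheap element-wise product in the focal plane. Here is a software analogy with NumPy; this is my illustration, not Optalysys's published method:

```python
# Software analogy (my assumption, not Optalysys's published method):
# a lens performs a 2-D Fourier transform of the light field, so a
# convolution becomes an element-wise product in the focal plane.

import numpy as np

field = np.random.rand(256, 256)      # stand-in for the liquid-crystal grid
kernel = np.random.rand(256, 256)     # second grid encoding the operation

# What the optics would do "for free": FFT -> multiply -> inverse FFT.
convolved = np.fft.ifft2(np.fft.fft2(field) * np.fft.fft2(kernel)).real
print(convolved.shape)                # (256, 256)
```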