Minds can now do more than control machines. Last month, scientists achieved the first remote human-to-human brain interface, when Rajesh Rao sent a brain signal over the Internet that moved the hand of his colleague Andrea Stocco, even though Stocco was sitting all the way across the University of Washington's campus.
Using one human brain to direct another person's body via the Internet was an amazing breakthrough. But other feats of mind control are already realities, particularly in the realm of human machine interfaces (HMIs).
The history of brain–computer interfaces (BCIs) starts with Hans Berger's discovery of the electrical activity of the human brain and the development of electroencephalography (EEG). In 1924 Berger was the first to record human brain activity by means of EEG. Berger was able to identify oscillatory activity, such as Berger's wave or the alpha wave (8–13 Hz), by analyzing EEG traces.
Berger's first recording device was very rudimentary. He inserted silver wires under the scalps of his patients. These were later replaced by silver foils attached to the patients' heads with rubber bandages. Berger connected these sensors to a Lippmann capillary electrometer, with disappointing results. However, more sophisticated measuring devices, such as the Siemens double-coil recording galvanometer, which displayed electric voltages as small as one ten-thousandth of a volt, led to success. Berger analyzed the interrelation of alterations in his EEG wave diagrams with brain diseases. EEGs opened completely new possibilities for research into human brain activity.
Professor Jacques Vidal coined the term "BCI" and produced the first peer-reviewed publications on this topic. Vidal is widely recognized as the inventor of BCIs in the BCI community.
Vidal's first BCI relied on visual evoked potentials to allow users to control cursor direction, and visual evoked potentials are still widely used in BCIs today.
The brain is enormously complex. It enables the body to behave the way it does by sending electrical signals to different parts of the body.
Neuroprosthetics is an area of neuroscience concerned with neural prostheses: using artificial devices to replace the function of impaired nervous systems or sensory organs. The most widely used neuroprosthetic device is the cochlear implant, which, as of December 2010, had been implanted in approximately 220,000 people worldwide. There are also several neuroprosthetic devices that aim to restore vision, including retinal implants.
The difference between BCIs and neuroprosthetics is mostly in how the terms are used: neuroprosthetics typically connect the nervous system to a device, whereas BCIs usually connect the brain (or nervous system) with a computer system. Practical neuroprosthetics can be linked to any part of the nervous system—for example, peripheral nerves—while the term "BCI" usually designates a narrower class of systems which interface with the central nervous system.
The terms are sometimes, however, used interchangeably. Neuroprosthetics and BCIs seek to achieve the same aims, such as restoring sight, hearing, movement, ability to communicate, and even cognitive function. Both use similar experimental methods and surgical techniques.
An Arduino can be used to process the EEG signal together with an IR sensor. Arduino is an open-source prototyping platform based on easy-to-use hardware and software. Arduino boards are able to read inputs - light on a sensor, a finger on a button, or a Twitter message - and turn them into outputs - activating a motor, turning on an LED, publishing something online. All of this is defined by a set of instructions programmed through the Arduino IDE and sent to the microcontroller on the board.
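The read-an-input, act-on-an-output loop described above is the core of every Arduino sketch. As a rough illustration only (the sensor readings, threshold, and LED stand-ins here are hypothetical, not from any particular project), the same logic can be sketched in Python:

```python
def led_state(reading, threshold=512):
    """Arduino-style mapping: an analog input above the threshold turns the LED on.

    `reading` stands in for an analogRead() value (0-1023 on a typical board);
    the 512 threshold is an arbitrary midpoint chosen for illustration.
    """
    return "HIGH" if reading > threshold else "LOW"

# Stand-in for the board's main loop: sample the sensor, update the output.
readings = [100, 700, 512, 900]          # simulated analogRead() samples
states = [led_state(r) for r in readings]  # -> ["LOW", "HIGH", "LOW", "HIGH"]
```

On real hardware the same decision would sit inside `loop()`, with `analogRead()` supplying the reading and `digitalWrite()` driving the LED.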
The mind works constantly on things we take for granted, exploring them at speed and manipulating a great deal of data. A computer can even be shut down with the help of the mind. Ordinarily we use our hands to reach the shutdown button, but with these new developments we can make the same effort using only the brain. Our hands serve us in countless applications, yet with this approach we simply think, and the smartphone, smart home, or computer responds accordingly.

How does all the neuro-magic happen? At the heart of the EEG project is a retired ThinkGear ASIC module by NeuroSky. It comes loaded with algorithms that amplify and process the faint electrical signals coming from the surface of the brain. A few small electrodes, made from sheets of copper and placed in contact with the forehead, are responsible for picking up these signals. The bridge between the electrodes and the ThinkGear is an Arduino running the illumino project code. In the tutorial, a TinyLily Arduino is used to mesh with the wearable medium, since all of these parts are concealed in the folded brim of a beanie. With an EEG headset, the mind can exercise simple control directly; IBM is among the companies working on such hardware.
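The ThinkGear module streams its processed data over a serial link in framed packets. As commonly documented in NeuroSky's published serial protocol, a packet starts with two 0xAA sync bytes, then a payload length, the payload itself, and a one-byte checksum (the bitwise inverse of the low byte of the payload sum). The sketch below is a simplified parser based on that published layout (it handles only the single-byte attention and meditation rows and skips everything else, so it is not a complete implementation):

```python
def thinkgear_checksum(payload):
    # ThinkGear checksum: bitwise inverse of the low byte of the payload sum.
    return (~sum(payload)) & 0xFF

def parse_packet(data):
    """Parse one ThinkGear packet; return the eSense values found.

    Simplification: assumes every payload row is a single-byte value
    (code, value); real packets also carry multi-byte rows such as raw EEG.
    """
    if data[0] != 0xAA or data[1] != 0xAA:
        raise ValueError("bad sync bytes")
    plen = data[2]
    payload = data[3:3 + plen]
    if thinkgear_checksum(payload) != data[3 + plen]:
        raise ValueError("bad checksum")
    values = {}
    i = 0
    while i < len(payload):
        code = payload[i]
        if code == 0x04:            # attention eSense, 0-100
            values["attention"] = payload[i + 1]
        elif code == 0x05:          # meditation eSense, 0-100
            values["meditation"] = payload[i + 1]
        i += 2                      # skip unrecognized single-byte rows
    return values
```

An Arduino in the headset brim would run the equivalent byte-by-byte state machine, reading the serial stream as it arrives rather than from a complete buffer.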
The future is very close
Here are some amazing examples of what our brains can already do.
Compose and Play Music- Now musicians might be able to eliminate the need for tools and interfaces like sheet music—or even playing an instrument—by simply creating music directly with their thoughts.
Screen Mobile Phone Calls- Like a tough personal secretary, Ruggero Scorcioni's Good Times app filters the incoming calls of busy mobile phone users by simply monitoring the state of the user's brain.
Create a 3-D Object- A Chilean company has announced the first object to be created by thought alone—paired with the growing power of the latest 3-D printing machines.
Drive a Wheelchair, and Even a Car- For the disabled, the ability to move about using the power of their minds could be life-changing. To that end, scientists have worked for years on wheelchairs and other devices that could restore mobility to people who have lost control of their own bodies but still have sharp minds.
At the Swiss Federal Institute of Technology in Lausanne, scientists have added "shared control" to the concept. The chair's software analyzes the cluttered surrounding environment and blends that information with the driver's brain commands to avoid problems such as collisions with objects.
The system also eases the strain of command because users needn't continually instruct the chair—the software processes a single directional command and automatically repeats it as often as needed to navigate the space.
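The shared-control idea above can be sketched as a simple loop: the driver issues one directional command, and the software repeats it each step while checking for obstacles. Everything in this sketch (the grid world, the function name, the stopping rule) is a hypothetical illustration of the general idea, not the Lausanne group's actual algorithm:

```python
def shared_control(start, command, obstacles, max_steps):
    """Repeat a single directional command, stopping short of obstacles.

    start: (x, y) cell; command: (dx, dy) direction decoded from the driver;
    obstacles: set of blocked (x, y) cells detected by the chair's sensors.
    Returns the path actually driven.
    """
    x, y = start
    path = [start]
    for _ in range(max_steps):
        nxt = (x + command[0], y + command[1])
        if nxt in obstacles:   # software overrides the repeated command
            break
        x, y = nxt
        path.append(nxt)
    return path
```

The point of the structure is exactly what the article describes: the driver's brain supplies one command, and the software carries the burden of repeating it and vetoing unsafe moves.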
"Bionic" Limbs- In some instances, human machine interfaces are becoming part of the human body. One new prosthetic even provides a sense of "touch" like that of a natural arm, because it interfaces with the wearer's neural system by splicing to residual nerves in the partial limb.
The future of brain control is bounded only by the human mind itself. The Internet of Things and brain control are converging, promising us endless opportunities.