
Sensebridge: The Noisebridge Cyborg Group

[Image: coin-type pager motor (CoinTypePagerMotors.jpg)]

"If you can't beat the robot armies, join them."

We are concerned with human-machine interfaces.


Motivations

We want to "make the invisible visible", to bridge our senses. Many group members were inspired years ago by this awesome Wired article, which describes obtaining an unerring sense of direction via a compass belt. Is the brain really plastic enough to adapt to entirely new senses? How natural would it feel after you've fully adapted? Then what happens when you take the device off?

What new sense do you want? What existing sense might it map onto?
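To make the compass-belt idea from the Wired article concrete, here is a minimal Python sketch. The eight-motor layout, the motor_for_heading name, and the clockwise numbering are illustrative assumptions, not hardware or code from any Sensebridge project.

    # A minimal sketch of the compass-belt idea, assuming 8 vibration motors
    # worn in a ring and numbered clockwise from the front.
    N_MOTORS = 8  # motors spaced 360/8 = 45 degrees apart

    def motor_for_heading(heading_deg: float) -> int:
        """Index of the motor currently pointing at magnetic north,
        given the wearer's heading in degrees (0 = facing north)."""
        bearing_to_north = (360.0 - heading_deg) % 360.0  # clockwise from the front
        sector = 360.0 / N_MOTORS
        return int((bearing_to_north + sector / 2) // sector) % N_MOTORS

    if __name__ == "__main__":
        for heading in (0, 45, 90, 180, 270):
            print(f"facing {heading:3d} deg -> buzz motor {motor_for_heading(heading)}")

On real hardware the loop would read a magnetometer and drive the selected motor; the point is only that the mapping itself is a few lines of arithmetic.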

Terminology

To avoid confusion (where possible) we'll try to use these terms.

"Sensor" is the device that detects external stuff that the body (probably) can't detect. Sensors can be divided into those that sense the world, and those that digitize a user's actions. The latter are more commonly called "human input devices", (HID's).

"Display" is the way the device shows data to the body (not necessarily visually). This gets around the device-centric / body-centric problem that shows up when using the terms input & output.

"Armature" is the physical component of the device: the belt, hat, armband, etc; distinct from the electronics.

Mailing List

We have created a cyborg mailing list (https://www.noisebridge.net/mailman/listinfo/cyborg) on the Noisebridge server. Please join us!

Component Sourcing

Here are pages for discussing:

  • sensors: detect a property of the world
  • displays: communicate it to the body/mind
  • armatures: hold the above together and in the proper place on the body

They also include a list of all known/speculated types of each.

Sensor and Display API

  • What might that mean?
    • A standard protocol that can:
      1. encompass most sensory data of the sort we are interested in researching
      2. allow for presentation of any of those data through a wide variety of modalities (a rough sketch of one such protocol follows this list)
  • Why?
    • Plug-and-play(ish) interoperability of sensors and displays. Build three sensors and three displays, and we have nine possible combinations, and so on.
    • Simplify development by separating sensing and presentation design/tasks.
  • How?
    • On the data sensing end:
      1. Identify a primary set of interesting stimuli, consider various encodings thereof.
      2. Abstract away from the physical phenomena the stimuli represent and look instead at common patterns in the data streams.
      3. ???
    • On the data display end:
      1. Identify and prototype a number of display methods.
      2. For each method think of:
        • What modulations of the signal are possible: intensity, spacing, direction, timing...? Based on that, what is the theoretical bandwidth of that method?
        • How sensitive is the average person at that location on the body? How much of the signal can the brain interpret meaningfully (too much noise vs. too weak a signal)? Based on that, what is the practical bandwidth that we might expect?
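To make the idea of a common API more concrete, here is a minimal Python sketch of one way such a protocol could look. Every class name, field, and number below is an illustrative assumption, not an agreed Sensebridge standard, and the random values stand in for real hardware reads.

    # One possible common protocol: every sensor emits the same kind of Frame,
    # and every display consumes Frames, so any sensor can drive any display.
    from dataclasses import dataclass
    from typing import List
    import math
    import random

    @dataclass
    class Frame:
        channels: List[float]   # each channel normalized to 0.0 .. 1.0
        rate_hz: float          # how often the sensor produces frames

    class CompassSensor:
        def read(self) -> Frame:
            heading = random.uniform(0, 360)             # stand-in for a magnetometer
            return Frame([heading / 360.0], rate_hz=10)

    class LoudnessSensor:
        def read(self) -> Frame:
            return Frame([random.random()], rate_hz=20)  # stand-in for a microphone level

    class VibroDisplay:
        def show(self, f: Frame) -> None:
            print("buzz intensity:", [round(c, 2) for c in f.channels])

    class LedDisplay:
        def show(self, f: Frame) -> None:
            print("LED brightness:", [round(c, 2) for c in f.channels])

    # Any sensor pairs with any display: 2 x 2 = 4 combinations here,
    # 3 x 3 = 9 with one more of each.
    for sensor in (CompassSensor(), LoudnessSensor()):
        for display in (VibroDisplay(), LedDisplay()):
            display.show(sensor.read())

    # Rough theoretical-bandwidth arithmetic in the spirit of the questions above:
    # 8 motors x 4 distinguishable intensity levels x 10 updates/s
    # = 8 * log2(4) * 10 = 160 bits/s; practical bandwidth will be much lower.
    print(8 * math.log2(4) * 10, "bits/s")

As long as every sensor agrees on the frame format, the display side never needs to know which physical quantity it is rendering; that separation is the whole point.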

Current Projects

Compass Vibro-Anklet

Eyes in the back of your back

Tongue Tingler (Trans-lingual Electric Nerve Stimulator)

Free hands input device

Pulse Necklace

QuickLook Displays

Ultrasonic Hearing / Echolocation

Technicolor Dreamcoat


List of senses we do not possess

Miscellaneous Distractions

Edible interconnects: http://pony.noisebridge.net/~cmaier/recherche_du_temps_perdu/

Cybernetic Theory

Cybernetic loop of GPS + vibrating hat (diagram: https://www.noisebridge.net/images/d/d7/Find_Treasure_%28GPS%29.jpg).

Pseudo-cybernetic loop showing neural integration of an artificial sensory system (diagram: https://www.noisebridge.net/images/8/8f/Sensory_Addition_%281.5_order%29.jpg).
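As a hedged Python sketch of the GPS + vibrating-hat loop in the first diagram: sense position, compute the bearing to a target, display it on the body, let the wearer move, repeat. The get_gps_fix and buzz functions and the coordinates are placeholders, not real drivers.

    # Sketch of the sense -> display -> human-movement loop diagrammed above.
    import math
    import time

    TARGET = (37.7627, -122.4426)   # illustrative target (lat, lon)

    def bearing(from_pt, to_pt) -> float:
        """Initial great-circle bearing from from_pt to to_pt, in degrees."""
        lat1, lon1, lat2, lon2 = map(math.radians, (*from_pt, *to_pt))
        dlon = lon2 - lon1
        x = math.sin(dlon) * math.cos(lat2)
        y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
        return math.degrees(math.atan2(x, y)) % 360

    def get_gps_fix():
        return (37.7625, -122.4424)  # stand-in for reading a real GPS module

    def buzz(direction_deg: float) -> None:
        print(f"buzzing the motor nearest {direction_deg:.0f} degrees")  # stand-in driver

    for _ in range(3):                  # on real hardware this would run forever
        here = get_gps_fix()            # sense: where am I?
        buzz(bearing(here, TARGET))     # display: which way is the target?
        time.sleep(1)                   # the wearer moves; the loop closes through them

Everything hardware-specific sits behind get_gps_fix() and buzz(), so the same loop could drive a hat, belt, or anklet.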

Cybernetics page on Wikipedia: http://en.wikipedia.org/wiki/Cybernetic


Links

YouTube video of an experimenter stimulating his face with small electrical pulses to produce externally controlled movements: http://www.youtube.com/watch?v=dy8zUHX0iKw

http://transcenmentalism.org/OpenStim/tiki-index.php

http://openeeg.sourceforge.net/doc/

http://www.biotele.com/ - remote control cockroach

http://www.instructables.com/id/Breath-powered-USB-charger/

VibroHat in action at BIL 2008: http://www.flickr.com/photos/quinn/2307779651/in/photostream/

Lamont's PWM code


Meeting Notes

March 15, 2009

March 22, 2009

March 29, 2009

April 5, 2009

April 19, 2009

(the meetings continue apace; the minutes, sadly not)

July 12, 2009

Hack Session July 13, 2009

July 19, 2009

August 2, 2009

Oct 4, 2009