
Advanced bioengineering in the far future

Carlobrand

"I'm fixing a hole where the rain gets in and stops my mind from wandering where it will go."

So, I'm letting my mind wander as I drive, and I'm thinking about vision tech. We have tech that lets us see deep below red, down into the useful frequencies of heat. We have tech that lets us see way past purple into the x-ray - though it's big and bulky. We have tech to "see" using sound and radio. In each case, the limit is the human brain: the tech has to translate the view into something the human eyes can see. Ergo, we can't see deep infrared AND visible light AND sound AND ... because it all needs to be translated into the rather narrow bandwidth our brains are equipped to handle through our eyes.

It doesn't have to be that way. At TL11, it is possible to have "artificial eyes" and "nerve refusion" (MT Referee's Companion). Conceivably, artificial eyes could be designed to extend the range of vision, permitting hawk-like ability to focus at a distance or improving nightvision to cat-like effectiveness, but extending up into the ultraviolet where birds see or down a bit into the near infrared might require improvements in the brain itself. At TL14, memory erasure hints at the ability to fiddle around with the fine structure of the brain, and genetic engineering is available. At some point, it becomes possible to genetically engineer a person whose brain is capable of visualizing a much wider spectrum from birth - a person who could be equipped with prosthetic sensors allowing him or her to simultaneously visualize the deep infrared of IR devices and the visible light spectrum, perhaps even to integrate other sensor modalities - radar, x-ray - as different "colors" when hooked up to prosthetics capable of delivering that information.

Now:

1. What does canon say about bioengineering people to incorporate technological advancements at the personal level? Does it occur? Are there prohibitions? Other than the phobic response that appears to set in at some point after the civil war, are there any social repercussions for people bred with such capabilities?

2. What other bioengineered tech incorporations are possible, and what are the social and practical consequences? Genegineering people to be able to integrate data flow from an artificial intelligence or other computing device, for example? Superhero comics offer a wide variety of options, but how would these genegineered, prosthetically enhanced people originate, what limits would the Imperial government place on their creation and their activities, and how would society respond to them? Is it "slavery" if a person genegineered to use a prosthetic enhancement is also genegineered from birth to possess fanatic loyalty, or if he's free to leave but the prosthetic has to stay, or if the only place people will tolerate him is within his employer's compound?

3. What adaptations might be available to players without throwing the game out of whack? Built-in jacks to plug into and understand computers? "Geordi"-style multifrequency visors? Microprocessor brain implants functioning as calculators or microencyclopedias? What could be added to an adult brain, and what would need to be engineered into the developing brain during gestation to be possible?
 
Not all people are the same. Some can see better at a distance than others; some need glasses. Some have better night vision than others. Some are more sensitive to pain than others. Some have more sensitive hearing or taste.

So some people can see, hear, smell, taste and feel in different ranges. Is this always the sensory organ itself, or could it be limitations or enhanced abilities of the brain?

Perhaps at a young age some people do "see ghosts" or "hear monsters," but as they grow older and are continuously told it's nothing, their brains learn to ignore this input. Perhaps some people who are considered crazy or abnormal are just sensing things outside the normal range, but without training - "the sky is blue, the grass is green," "the cow goes moo, the duck goes quack" - their brains don't know how to process the beyond-normal sensory input: "this is what heat waves at a certain temperature look like," "that type of chemical reaction makes the air nearby smell or taste like this"...

My point is that I give the brain a little more credit: if extra sensory devices are added, then with proper training some people could recognize a wider range of input without bio-genetic re-engineering of the brain.
 
I believe here you, as the GM, have to decide for yourself what game balance is and determine what you will or will not allow in your game. In my game I allow replacement body parts - aka bionics. They have benefits, but they also have built-in negatives. It is all for possible plot devices and better game play.
 
Our eyes are limited by the size of our brains and head. An awful lot of our brain's resources have to be used to process the visual signal. If our eyes absorbed more wavelengths of light - even in the visible range we cannot perceive as many wavelengths as some creatures - we would need a larger brain, and a larger head to contain that larger brain.

Hence mechanical augmentation becomes a better option than bioengineering.

At high TLs I'm sure it would be possible to gene-engineer a much more versatile eye structure, not to mention re-routing the optic nerve to shorten it and putting the blood vessels outside rather than inside the retina. UV and near IR could be built in, as well as increasing the visible wavelength perception.
Such a change would require a much larger brain, so a bigger body to maintain symmetry, or an abnormally large head, would be the telltale sign of such bio-engineering.
 
Our eyes are limited by the size of our brains and head. An awful lot of our brain's resources have to be used to process the visual signal. If our eyes absorbed more wavelengths of light - even in the visible range we cannot perceive as many wavelengths as some creatures - we would need a larger brain, and a larger head to contain that larger brain. ...

In human males, brain volume can range from 1050 to 1500 cc, averaging about 1260 to 1270, depending on source. The most significant variable correlating with brain size is body size, but there is still a good deal of variation among individuals irrespective of body size. The brain proper occupies about 80% of the skull volume, the remainder about evenly divided between blood pumping through the brain and cerebrospinal fluid enveloping and acting as a cushion for the brain.

The visual cortex is located at the back of the brain; it's about 30% of the cortex, but the cortex as a whole is about a quarter of the overall brain, meaning the visual cortex is around 7-8% of the overall brain. Additionally, there are other structures associated with vision, such as the lateral geniculate nucleus - all told, maybe 10% of the brain is directly involved in vision. Vision includes not only color but also such things as pattern recognition and spatial organization, and some elements feed directly to "lower" centers of the brain, which is why, when you see something startling, your body reacts a fraction of a second before your conscious brain registers the view.
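
Pulling those percentages together as a quick back-of-the-envelope check (the figures are just the approximate ones cited above, nothing more precise than that):

Code:
# Rough check of the figures above (all values approximate).
brain_volume_cc = 1265           # average adult male brain, ~1260-1270 cc
visual_share_of_cortex = 0.30    # visual cortex as a fraction of the cortex
cortex_share_of_brain = 0.25     # cortex as a fraction of the whole brain

visual_share_of_brain = visual_share_of_cortex * cortex_share_of_brain
print(f"visual cortex: about {visual_share_of_brain:.1%} of the brain")   # ~7.5%

# Allow another ~2-3% for other vision-related structures (lateral geniculate
# nucleus and the like) to get the ~10% figure used below.
vision_total = visual_share_of_brain + 0.025
print(f"vision-related tissue: about {vision_total:.1%}")                 # ~10%
print(f"that's roughly {vision_total * brain_volume_cc:.0f} cc of tissue")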

Given that we're dealing with about 10% of the brain to begin with, that we're only talking about adding tissue to increase the visual spectrum, and that there is already a good deal of brain-size/head-size variation among humans, I really don't think any increase in volume is going to be noticeable to an outside observer. I suspect the equipment attached to the head will attract notice long before anyone starts thinking about head size.

This doesn't seem to be attracting much interest. I suspect most gamers deal purely with the hardware and don't explore the biological or sociological aspects much. Visual augmentation is just one angle; the realm of savants suggests the brain is potentially capable of some truly remarkable feats, and war offers plenty of inducement for changes such as bioengineered strength or agility. Physical and cognitive talents are not the only option: a despot could commission a generation of troops bioengineered for fanatical loyalty and berserk ferocity on top of whatever physical traits are desired.

A sufficiently advanced medical technology will find ways to induce development of some of those talents. The questions then become why it does that, what happens to the people it creates, how the hoi polloi respond to the potential for such creation, and how they behave towards the people who come out of it. There's a lot of untapped sci-fi territory in fiddling around with human biological potential.
 
On what measurement or standard is "percent of the brain being used" based?
If it is electrical activity, what is the basis for the 100%? What is not being used?
 
The 'humans only use 10% of their brains' idea has been proven to be a fallacy.

I saw this on Discovery Science channel.
 
Conceivably, artificial eyes could be designed to extend the range of vision, permitting hawk-like ability to focus at a distance or improving nightvision to cat-like effectiveness, but extending up into the ultraviolet where birds see or down a bit into the near infrared might require improvements in the brain itself.

Not sure about that. The human eye doesn't transmit the actual images; the "pre-processing" could be done by the artificial eye itself. The person would have to be familiarized with what the new color combinations represented, though.

http://www.thecrimson.com/article/1922/4/28/measure-optic-nerve-impulses-accurately-pthe/
 
I was basing my earlier comments on this study:
http://rspb.royalsocietypublishing.org/content/280/1758/20130168

The hypothesis is that the larger eyes of the Neanderthal required more of their brain to process visual signals, leaving less brain available for reasoning.


Hmm. See the latest genetic research showing which humans have Neanderthal DNA and which don't, as well as the art now attributed to them. That evidence disputes what is conjectured in your link...
 
The 'humans only use 10% of their brains' idea has been proven to be a fallacy.

I saw this on Discovery Science channel.
While you are correct that 'humans only use 10% of their brains' is a fallacy (it is roughly analogous to saying that a car only uses 25% of its engine since at any given time only about 1/4 of the cylinders are firing), in this case the 10% he was referring to was the portion of the brain set aside for visual processing.
 
Not sure about that. The human eye doesn't transmit the actual images; the "pre-processing" could be done by the artificial eye itself. The person would have to be familiarized with what the new color combinations represented, though.

http://www.thecrimson.com/article/1922/4/28/measure-optic-nerve-impulses-accurately-pthe/

Of course, in this case the pre-processing would carry its own penalties as well. Assuming you aren't increasing the bandwidth of information to the brain (the reason you are pre-processing in the first place), a human who could see in a wider spectrum would have a reduced ability to resolve colors in the normal spectrum, which might give them penalties in some situations.

Similarly, a person who could see things further away would probably have a reduction in their field of vision in proportion to the distance they could see (i.e., if they could see three times further, their field of view would be reduced to one-third of the norm).
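
A toy way to express that fixed-bandwidth trade-off - purely illustrative numbers, not anything out of a rulebook:

Code:
# Toy model: a fixed information budget means a wider spectrum or longer reach
# costs resolution elsewhere (illustrative assumptions only).
BASELINE_FOV_DEG = 180      # assumed normal human field of view
BASELINE_SPECTRUM = 1.0     # normal visible spectrum, taken as a unit width

def color_resolution(spectrum_width):
    """Seeing a wider spectrum -> coarser color discrimination in any one band."""
    return BASELINE_SPECTRUM / spectrum_width

def field_of_view(magnification):
    """Seeing further ('zoom') -> proportionally narrower field of view."""
    return BASELINE_FOV_DEG / magnification

print(color_resolution(2.0))   # 0.5  -> twice the spectrum, half the color detail
print(field_of_view(3.0))      # 60.0 -> three times the reach, a third of the view

A referee could turn numbers like those into situational DMs if the trade-off should show up in play.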

In the case of cybernetic eyes where functions could be turned on and off on command this probably wouldn't be a significant problem but with biologically modified eyes it seems like the drawbacks would be fairly constant.

I suppose at high enough tech levels it might be possible to engineer cells which can be chemically activated and deactivated, enabling the creation of a person who could 'activate' additional visual abilities via a pill, shot, or even a cybernetic implant containing a supply of the necessary chemicals. There would probably be a slight delay in activation, however, depending on the method (pills would probably take the longest, though using transmission methods similar to how a neurotoxin works, they could still be very quick indeed).
 
On what measurement or standard is "percent of the brain being used" based?
If it is electrical activity, what is the basis for the 100%? What is not being used?

The percent is based on the number of neurons identified as involved in the activity. Not surprisingly, there are quite a few different numbers floating about. According to this site, there are on average a hundred billion neurons in the brain, of which 22.8 billion are in the neocortex.

http://faculty.washington.edu/chudler/facts.html

The primary visual cortex (V1, area 17) has ~538 million cells, but there are other parts involved in vision as well (V2 through V5).

http://en.wikipedia.org/wiki/Visual_cortex

Different sources phrase things differently. Some say that 30% of the cortex is involved in vision. Others say 50% of "pathways" are involved in vision. I suspect differences relate to where you're drawing boundaries: your memory, for example, would have a lot of "pathways" linked to your visual centers both in the course of storing visual memories and in the course of recalling them, but memory is not vision.
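
For what it's worth, running the cited counts through the arithmetic shows how much the boundary you draw matters (the numbers are the ones from the links above; the comparison itself is just illustrative):

Code:
# Arithmetic on the neuron counts cited above.
total_neurons = 100e9     # whole brain, average
neocortex     = 22.8e9    # neurons in the neocortex
v1_neurons    = 538e6     # primary visual cortex (V1, area 17) alone

print(f"V1 alone: {v1_neurons / neocortex:.1%} of the neocortex")     # ~2.4%
print(f"V1 alone: {v1_neurons / total_neurons:.2%} of all neurons")   # ~0.5%

# If ~30% of the cortex is involved in vision (V1 plus V2-V5 and friends),
# the visually-involved population is far larger:
vision_cortex = 0.30 * neocortex
print(f"vision-related cortex: ~{vision_cortex / 1e9:.1f} billion neurons, "
      f"about {vision_cortex / total_neurons:.0%} of all neurons")    # ~7%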

Now, as to how they arrive at counts of neurons - my best guess is they count the number in a small volume and then multiply up, but I don't know for sure. With MRIs and such, they can look at your brain as it receives sensory stimulus or controls your body, and they can see what parts of the brain are "lighting up" while you're doing it, so they've got a decent picture of the brain's gross structures and how they interact. Fine details are still difficult; when we get ourselves an MRI "microscope" that can observe all the individual cells in real time, it's going to get really interesting.
 
Of course, in this case the pre-processing would carry its own penalties as well. Assuming you aren't increasing the bandwidth of information to the brain (the reason you are pre-processing in the first place), a human who could see in a wider spectrum would have a reduced ability to resolve colors in the normal spectrum, which might give them penalties in some situations.

Agreed. It would be a trade off doing it like this. Cybernetic would make more sense at higher TL's.
 
While you are correct that 'humans only use 10% of their brains' is a fallacy (it is roughly analogous to saying that a car only uses 25% of its engine since at any given time only about 1/4 of the cylinders are firing), in this case the 10% he was referring to was the portion of the brain set aside for visual processing.

As Carlobrand says, it is more than that for vision alone as well.

There is a program where a scientist is doing research on a mind-reading computer. Based on the subject's thoughts as the person looks at a photo, it guesses what the person is looking at. Discovery Science channel.

Some were very close.

But something the researcher didn't mention was: how many different pictures did the subjects have to look at? Is the picture pool small, or in the hundreds of thousands of photos?

In the first instance, it's not much of a challenge: the computer just compares what the subject sees against the small pool of photos in its database, so a match should be much easier to find.
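
A bare-bones sketch of that "compare against the pool" idea - purely illustrative, with made-up feature vectors standing in for whatever the real experiment decodes from brain activity:

Code:
# Illustrative only: pretend each photo reduces to a small feature vector,
# and the "mind reading" step produces a noisy vector to match against the pool.
import numpy as np

def best_match(decoded, pool):
    """Return the index of the pool photo most similar to the decoded signal."""
    sims = [np.dot(decoded, img) / (np.linalg.norm(decoded) * np.linalg.norm(img))
            for img in pool]
    return int(np.argmax(sims))

pool = np.array([[1.0, 0.0, 0.2],    # "face"
                 [0.0, 1.0, 0.1],    # "house"
                 [0.3, 0.3, 1.0]])   # "landscape"

decoded = np.array([0.2, 0.9, 0.0])  # noisy readout while the subject views the "house"
print(best_match(decoded, pool))     # -> 1: trivial with only three candidates

# With hundreds of thousands of candidate photos, the decoded signal would have
# to be far more precise before the nearest match was reliably the right one.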
 