World, Writing, Wealth discussion

Brain Implants and Remote Control - Zuckerberg Style


message 1: by Graeme (new)

Graeme Rodaughan Facebook founder Mark Zuckerberg is big into neural implants. While ostensibly aimed at creating therapies, the potential for this technology to be nefariously used for remote control of humans is clear.

REF: https://www.businessinsider.com.au/ch...

Any thoughts?


message 2: by Joanna (new)

Joanna Elm | 145 comments Because we want Zuckerberg and Facebook to have access to our thoughts so he can sell those to third parties, and the government!


message 3: by Graeme (new)

Graeme Rodaughan I kinda imagine that the 'Borg' were using this tech.


message 4: by Scout (new)

Scout (goodreadscomscout) | 8073 comments I'm always thinking about the nefarious ways that corporations and government can use my info. No neural implants for me. Sometimes I go for a walk down a country road and leave my phone behind. Does anyone know where I am? I'm hoping not. Just feels good to be free.


message 5: by Matthew (new)

Matthew Williams (houseofwilliams) How are the nefarious ways clear? There is no known means of controlling a person through a neural implant, beyond dystopian science fiction scenarios.


message 6: by Scout (new)

Scout (goodreadscomscout) | 8073 comments So you're up for a neural implant?


message 7: by Matthew (new)

Matthew Williams (houseofwilliams) Scout wrote: "So you're up for a neural implant?"

Personally, I'm not sure. I could foresee all kinds of good applications for them, not the least of which is treating mental illness and helping people who have suffered a brain injury. And commercially, they are likely to become the next thing people "can't do without", much like smartphones are now.

I imagine that before I die, they will become mainstream, and I might cave to pressure because I need help with my memory or some such thing :)


message 8: by Graeme (new)

Graeme Rodaughan Hi Matthew,

From the article:

"One recent project is a wireless brain implant that can record, stimulate, and disrupt the movement of a monkey in real time.

...

The Wand could “sense” when the primate was about to move the joystick and stop that movement with a targeted electric signal sent to the right part of its brain, Muller said. And since the machine was wireless, the monkey didn’t need to be physically confined or attached to anything for it to work.

“This device is game-changing in the sense that you could have a subject that’s completely free-moving and it would autonomously, or automatically, know” when and how to disrupt its movement, said Muller."


Just implant the prisoner population and ensure they can't move out of a location and save money on prison walls.


message 9: by Scout (new)

Scout (goodreadscomscout) | 8073 comments Oh, hell no. I hope this isn't where we're going. Anyone could be labeled a "prisoner" in the future. No. Just no.


message 10: by Ian (new)

Ian Miller | 1857 comments I guess it depends on how it is done. I had them in one of my SF stories, but they were only useful for transferring information, including to and from computers that were tuned for that implant. The concept was that because all brains are wired differently, the implant had to be wired (and not necessarily with wire) to suit the brain and therefore they could not be hacked. Further, it would immediately be alerted if anybody tried to hack, and the master computer (for that person) would be able to do all sorts of things to the hacker. The uses were to directly interact with machines by thought and to manage avatars.

Would I have them? At my age, no.


message 11: by Graeme (last edited Jan 02, 2019 02:12AM) (new)

Graeme Rodaughan Any hacking defense can be overcome.


message 12: by Ian (new)

Ian Miller | 1857 comments Not when you have to have the geometry of the interface to make contact. That depends on the probability of knowing the neuron linkage pattern, and the number of permutations and combinations here makes astronomical numbers look small.
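A rough illustration of that combinatorial point, as a Python sketch (the 128-channel figure and the guess-the-mapping framing are purely hypothetical, for illustration only):

import math

# Hypothetical scenario: an attacker must work out which of N implant
# channels maps to which of N target neurons. With no other knowledge,
# the search space is N! (every possible ordering of the channels).
channels = 128
mappings = math.factorial(channels)

print(f"{channels} channels -> about {mappings:.2e} possible mappings")
# 128! is roughly 3.9e215, vastly more than the ~1e80 atoms estimated
# to exist in the observable universe, so guessing the wiring by brute
# force is hopeless even at a modest channel count.

Even before you add the tamper alerts Ian describes, the sheer size of that search space is doing most of the work.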


message 13: by Matthew (new)

Matthew Williams (houseofwilliams) Graeme wrote: "Hi Matthew,

From the document.

"One recent project is a wireless brain implant that can record, stimulate, and disrupt the movement of a monkey in real time.

...
The Wand could “sense” when the..."


Ah, I see. And yes, if the implants placed in a human brain were subject to external control, there could be terrible abuses. Except that any attempt to use them that way would inevitably be discovered and lead to the full-scale destruction of the company and the implants themselves. It does make for an awesome dystopian sci-fi novel though :)

As for prisoners, that is actually a very likely application. If you recall, I used a similar idea in my first book, the "spike", which arrested aggressive behavior. I had a chapter in which one malfunctions, but the editor told me to cut back and I had to remove it.


message 14: by J.J. (new)

J.J. Mainor | 2440 comments Joanna wrote: "Because we want Zuckerberg and Facebook to have access to our thoughts so he can sell those to third parties, and the government!"

Sadly, most people wouldn't have a problem with that if it were actually implemented...


message 15: by Graeme (new)

Graeme Rodaughan Matthew wrote: "If you recall, I used a similar idea in my first book, the "spike", which arrested aggressive behavior...."

Indeed, I do.

Precisely.


message 16: by Nik (new)

Nik Krasno | 19850 comments As with anything, there is no absolute good or bad. I'm sure, if this thing in the end helps combat epilepsy or spinal cord injuries, those cured would applaud, while those for whom it's elective will have their doubts.
Yes, I guess those who use it would become susceptible to being controlled, but it's probably gonna be a two-way street. Imagine being able to control (many) things remotely by brain power alone.
Is a nuclear bomb good or bad?


message 17: by Matthew (new)

Matthew Williams (houseofwilliams) Nik wrote: "As with anything, there is no absolute good or bad. I'm sure, if this thing in the end helps combat epilepsy or spinal cord injuries, those cured would applaud, while those for whom it's elective w..."

Well said. Neural implants are predicted to become the replacement for every digital device we currently use - a smartphone, IM system, GPS and navigation, internet search engine, email, and augmented/virtual reality. It's not a question of whether they are right or wrong, I think, but when they will become a reality. Because, let's face it, there's no preventing new technologies from becoming available unless the harm they do outweighs the good. And how does one prove that?

Is there any such thing as a technology that is not morally ambiguous? I like Nik's example of nuclear bombs. Can anyone think of an argument that they are no good, for any reason?


message 18: by Ian (new)

Ian Miller | 1857 comments In my opinion, we would be much better off without nuclear bombs because they have no function other than to kill a massive number of people. Yes, I suppose if we were to be invaded by aliens, the nuke would offer the best chance of seriously damaging them, but I don't think that is valid. I know some will maintain that nukes have prevented WW III, on the grounds that each side knows the other side will annihilate them, but I feel that is not enough to qualify as good, for two reasons. First, there were at least four times when WW III could have started accidentally, but for the good judgment of one person who refused to accept what he thought he was supposed to do. Second, while WW III has been averted, that is only up till now. It could still happen.

However, apart from that, I agree mainly with Matthew. It is like Pandora's box - once the technology is discovered it cannot be put back in the box. If it works, it will be used. For those worried by it, you don't have to accept the implant.


message 19: by Matthew (new)

Matthew Williams (houseofwilliams) Ian wrote: "In my opinion, we would be much better off without nuclear bombs because they have no function other than to kill a massive number of people. Yes, I suppose if we were to be invaded by aliens, he n..."

"I know some will maintain that nukes have prevented WW III,"

I am one of those, and I was hoping someone would say that, thanks :) The same holds for the alien invaders scenario. I can't personally think of any other argument in defense of nuclear armaments, especially when you consider the cost of maintaining a nuclear arsenal.

There's also the argument that without its nuclear arsenal, Israel would have been overrun and destroyed by now. Not sure if I'm one of those people, mainly because Israel has been developing nukes since the 1960s, a period during which two wars happened (1967 and 1973) and none thereafter.


message 20: by Graeme (new)

Graeme Rodaughan Matthew wrote: "Neural implants are predicted to become the replacement for every digital device we currently use - a smartphone, IM system, GPS and navigation, internet search engine, email, and augmented/virtual reality.'..."

I have a tendency to agree with you, Matt; however, I'm deeply suspicious of anyone having direct and potentially irrevocable access to my brain.

On the other hand, I was asked the question: would I take a NI if it made me smarter? And what would I do if all my colleagues took it too - would I be willing to be left behind?

I didn't have an answer for that.


message 21: by Matthew (last edited Jan 03, 2019 08:19PM) (new)

Matthew Williams (houseofwilliams) Graeme wrote: "Matthew wrote: "Neural implants are predicted to become the replacement for every digital device we currently use - a smartphone, IM system, GPS and navigation, internet search engine, email, and a..."

Yep, that's what we're dealing with, and that's what makes the future scary, isn't it? There are many great unknowns we'll be getting into thanks to the rapidly advancing pace of technological change. I do think dystopian scenarios are a bit naive, but they are instructional and necessary (and entertaining!) because they get us thinking about the issue.

Speaking of which, anyone here familiar with the technological singularity?


message 22: by Graeme (new)

Graeme Rodaughan Hi Matt, I work in engineering and IT fields. Risk management and product safety are drummed into everything we do... until it becomes second nature.

To me, dystopian scenarios amount to 'risk identification' activities, and I think they have more value than you imply.

It strikes me that our growing scientific/technical power as a species has multiple pathways forward that end badly for us.

I would include scientific/technical risk of something 'new,' disrupting society to such an extent that it crashes fatally as a real possibility that needs to be identified and mitigated.

On the flip side, where I work does cutting-edge technology as well. I'm kinda right in the middle of this stuff.


message 23: by Ian (new)

Ian Miller | 1857 comments As far as risks go, my opinion is that the most likely source of bad outcomes is unforeseen possibilities. The people who develop tech initially seldom foresee most of what could happen, and unfortunately the unscrupulous seem to locate these possibilities the way moths come to a light.


message 24: by Philip (new)

Philip (phenweb) Ian wrote: "As far as risks go, my opinion is that the most likely way there can be bad outcomes is the possibility of unforeseen possibilities. The people that develop tech initially seldom see the most of wh..."

Take the case of hackers. Tools used to diagnose and fix IT systems were misused to gain unauthorised access or leave backdoors and faults. Now criminals or just the curious can use the same tools and techniques to gain access/steal data etc.

Implants are coming - all the technology paths demonstrate they are not far off. We already have some enablers. Take the humble hearing aid: useful for the hard of hearing and even the clinically deaf, but how does a deaf person know that what is being broadcast to them via the aid is actually what is being said?

If that is a direct brain implant, but one connected to the outside world, then any use - storage, comms, health monitoring - could be taken over by malcontents. We have always had misuses of technology in support of greed by criminals or abuses of power - we always will.


message 25: by Ian (new)

Ian Miller | 1857 comments Notwithstanding the potential fears, why can't such a device be made that is restricted to one-way transmission? I am currently listening to a radio: while radios can be built to receive and transmit, most are simply receivers. Why can't such a neural implant be made "read only"? There is a major physical difference between scanning for electromagnetic changes and imposing them. Surely devices can be built that cannot change the chemical flows in the brain; indeed, working out how to do that would seem to be a major difficulty anyway. The brain works chemically at the base level, so how do you send certain chemicals to form some sort of matrix effect when they don't want to?


message 26: by Philip (new)

Philip (phenweb) Ian wrote: "Notwithstanding the potential fears, why cannot such a device be made that is restricted to one-way transmission. Thus I am currently listening to a radio. While radios can be built to receive and ..."

You can certainly make devices as receivers only; however, one of the uses mentioned earlier was communication, which tends to be a two-way process (I'll exclude the current incumbent of the White House, who seems to be in permanent transmit - dodgy relay, I suspect, on his Putin control unit - wrong thread).

Yes, read-only works for medical data, but what if a drug dosage is required? That implies a transmission to deliver the drugs.


message 27: by Ian (new)

Ian Miller | 1857 comments Surely if you were wanting to control drug delivery, you would put the control on whatever was injecting the drugs? Communication is a 2-way process, but it does not involve control. Although I can see that some real thought would be required to build a device that did not permit control.


message 28: by J.J. (new)

J.J. Mainor | 2440 comments Graeme wrote: "On the otherhand, I was asked the question, would I take a NI if it made me smarter? And, what would I do if all my collegues took it too, would I be willing to be left behind?..."

The same thing will hold true with genetic alterations...Once people start creating embryos designed to produce smarter children, bigger children, faster children, or whatever, it will become an arms race as people get into it just so their children aren't left behind...


message 29: by Scout (new)

Scout (goodreadscomscout) | 8073 comments We condemn Hitler for trying to genetically engineer a master race, yet it seems that that idea will soon be embraced by the masses. It was a bad idea then, and it's a bad idea now.


message 30: by Graeme (last edited Jan 08, 2019 09:57PM) (new)

Graeme Rodaughan I've got a short story coming out that incorporates neural implants and China's "social credit" rating system...


message 31: by Ian (new)

Ian Miller | 1857 comments Of course a serious objection to Hitler was he tried to get his master race by killing off everyone he thought did not qualify. That is certainly worse than designing them.

Designing humans is an interesting question. Is it wrong to have children free of really bad genetic defects that will just cause much pain and suffering? Is it wrong to have children who will live their lives cancer-free? The problem is that sooner or later people will start designing for all sorts of things that we really don't need, and while learning how to do it, they will make some awful mistakes. It seems to me to be one of those things we shouldn't allow, even though in principle it could be quite valuable if used in very specific ways.


message 32: by Scout (new)

Scout (goodreadscomscout) | 8073 comments A very very slippery slope, genetic selection. A perversion of natural selection. If genetic selection had been the thing when we were born, would you be here? Would I? I don't consider myself genetically superior to anyone. What will the world look like if there's a golden mean for appearance? Will everyone begin to look alike, as they do in Hollywood? What will happen if humans deem themselves the judge of who should be allowed to pass on their genetic traits and who shouldn't? And who will be in charge? And how will that be determined? Will there be artists and writers and independent thinkers? How do you select for that? I'm glad I'm an old lady who won't have to see what happens, because I don't think it will be good overall, despite the medical advances. That's just the lure that will convince people that this is a good thing. I feel in my bones that it's wrong.


message 33: by Ian (new)

Ian Miller | 1857 comments Happy New Year to you, Scout. The points you make are all valid. The counter is that it would be controlled in such a way that your fears would not be realized. And I can hear you from here saying, "Yeah, right." The probability that all would not be well is why this should not be permitted.


message 34: by Nik (new)

Nik Krasno | 19850 comments To a degree, we are already influenced (or controlled?) by various subtle or blunt means, be they mass media, cell phones, just a need to earn dough or whatever.
As for the genetic arms race - I guess it's interesting enough to have a dedicated thread.


message 35: by Matthew (new)

Matthew Williams (houseofwilliams) Graeme wrote: "I've got a short story coming on that incorporates neural implants and China's "social credit" rating system..."

Remind me to sue China for that, and Black Mirror. Years ago, I heard about this very idea and did a short story about a social credit system, basically a reputation index that people relied on the way they do their current credit scores.

I accuse China of plagiarism! ;)


message 36: by Matthew (new)

Matthew Williams (houseofwilliams) Scout wrote: "We condemn Hitler for trying to genetically engineer a master race, yet it seems that that idea will soon be embraced by the masses. It was a bad idea then, and it's a bad idea now."

That's the fear that genomic manipulation presents. However, the applications for it are for removing genetic diseases and anticipating future health problems. Cosmetic changes are what people fear will happen, which you clearly get because you mentioned how it's a "slippery slope". However, I don't think anyone would say no to genetic modifications if they found out their infant would be born with a terrible condition or was more likely to suffer from heart problems, mental illness, etc.


message 37: by Graeme (new)

Graeme Rodaughan Matthew wrote: "Remind me to sue China for that, and Black Mirror. Years ago, I heard about this very idea and did a short story about a social credit system, basically a reputation index that people relied on the way they do their current credit scores..."

I'd love to see someone successfully sue China for an IP breach.


message 38: by Graeme (new)

Graeme Rodaughan Matthew wrote: "That's the fear that genomic manipulation presents. However, the applications for it are for removing genetic diseases and anticipating future health problems.."
..."


This problem of the 'dual nature' of technical advance... There is a recurring theme of any advance being used for good or evil.

There may well have been someone who said something like, "Wow! I've just mastered fire, now I can warm my family when it's cold, ward off predators, and cook my food..."

and then a day later. "Ha Ha Ha... Now I can watch my enemies burn..."


message 39: by Ian (last edited Jan 10, 2019 08:14PM) (new)

Ian Miller | 1857 comments Suing China for plagiarism would not be very profitable. Actually, Matthew, to start with you would have to show the Chinese read your story and copied it in great detail. Simply having the same idea is not plagiarism.


message 40: by Matthew (last edited Jan 10, 2019 10:49PM) (new)

Matthew Williams (houseofwilliams) Ian wrote: "Sueing China for plagiarism would not be very profitable. Actually, Matthew, to start with you would have to show the Chinese read your story and copied it in great detail. Simply having the same i..."

Yeah, that was sarcasm, Ian. Not in a million years would I assume that the state of China actually read a short story of mine and adopted policy based on it. That's why I added a wink emoji at the end.


message 41: by Matthew (new)

Matthew Williams (houseofwilliams) Graeme wrote: "Matthew wrote: "That's the fear that genomic manipulation presents. However, the applications for it are for removing genetic diseases and anticipating future health problems.."
..."

This problem ..."


Absolutely. The question is, on balance, how have technological advances affected us overall? Would we not claim that the discovery of fire, the development of the printing press, industry, personal computers and the internet have all been to the benefit of humanity? Seriously, would we claim that? I know I would, but I'd like to hear counter-arguments on that front.


message 42: by Ian (new)

Ian Miller | 1857 comments Matthew wrote: "Ian wrote: "Sueing China for plagiarism would not be very profitable. Actually, Matthew, to start with you would have to show the Chinese read your story and copied it in great detail. Simply havin..."

Yeah, Matthew, I picked that. What I was curious about with the emoji was the semicolon - did you intend one eye to be weeping?


message 43: by Ian (new)

Ian Miller | 1857 comments In my view, technology is neutral. It is what people do with it that is either good or bad. The printing press allowed great literature, but it also allowed Mein Kampf. The press itself was neutral - merely a large mass of metal and some ink. Same with any other technology.


message 44: by Matthew (new)

Matthew Williams (houseofwilliams) Ian wrote: "Matthew wrote: "Ian wrote: "Sueing China for plagiarism would not be very profitable. Actually, Matthew, to start with you would have to show the Chinese read your story and copied it in great deta..."

No, Ian, that's an eye winking. It's called a winky, which indicates you are kidding. And do you mean to say you "picked up on that"? Because, if you knew I was being sarcastic, why did you go into an explanation of why I couldn't sue? It genuinely seemed to me like you thought I was serious.

Either way, no harm no foul. I genuinely wish I could sue, but Black Mirror would be my only real target :)


message 45: by Ian (new)

Ian Miller | 1857 comments Why did I go into an explanation? Good question. I wanted to prod a little (sorry) then I found I had to write something to do it. Oops. :-)


message 46: by Matthew (new)

Matthew Williams (houseofwilliams) Ian wrote: "Why did I go into an explanation? Good question. I wanted to prod a little (sorry) then I found I had to write something to do it. Oops. :-)"

Hey, no worries. I would be curious too. So, the issue of a social credit system: how would one pair that with brain implants? I seem to recall someone saying something about that.


message 47: by Ian (last edited Jan 11, 2019 08:17PM) (new)

Ian Miller | 1857 comments I can't see how you would pair a social credit system with implants. In fact, I am not sure what a social credit system is. When I was in Calgary a long time ago they had a Social Credit government, and everyone was prosperous, but then again, Calgary was awash with oil money, and that was why it was prosperous. There was a Social Credit party in NZ a rather long time ago, but they never got anywhere. I couldn't follow their policies, not helped by a government politician offering his take on their policy - a determination to introduce a $3 bill. Somehow that stuck in my memory, and that is about all I can recall of them.


message 48: by Matthew (last edited Jan 11, 2019 08:46PM) (new)

Matthew Williams (houseofwilliams) Ian wrote: "I can't see how you would pair a social credit system. In fact I am not sure what a social credit system is. When I was in Calgary a long time ago they had a social credit government, and everyone ..."

I know exactly what you're talking about. These days, they are called the New Democratic Party, but Social Credit is how they got their start. It's very cool you got to see that, I don't get to hear from people who witnessed those days very much. The Social Credit Party was intrinsic to the history of the Canadian west and kind of got forgotten in the conservative revolution of the 1980s.

Anyway, in this case, the social credit system refers to a digital ranking system that comes down to how esteemed a person is. It's an amalgamation of a person's credit score, financial history, online behavior (do they look at porn, etc.), their posts on social media, and how other people rate their interactions with them regularly. You distill that down to a score - 1-10 or out of 100 or 1000 - and you have a person's social credit score or reputation score.

Basically, it's like gossip and word of mouth on steroids, and accelerated by a factor of 100.
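To make that concrete, here is a minimal Python sketch of how such signals might be distilled into a single number. The weights, input names, and the 0-1000 scale are all invented for illustration; this is not a description of any real system.

# A minimal, hypothetical sketch of "distilling it down to a score".
# All weights and inputs are invented for illustration only.

def social_credit_score(credit_score, financial_history, online_behavior,
                        social_media, peer_ratings):
    """Combine signals (each normalized to 0.0-1.0) into a 0-1000 score."""
    weights = {
        "credit_score": 0.30,       # traditional credit rating
        "financial_history": 0.20,  # bill payments, debts, etc.
        "online_behavior": 0.15,    # browsing habits
        "social_media": 0.15,       # content of posts
        "peer_ratings": 0.20,       # how others rate their interactions
    }
    signals = {
        "credit_score": credit_score,
        "financial_history": financial_history,
        "online_behavior": online_behavior,
        "social_media": social_media,
        "peer_ratings": peer_ratings,
    }
    combined = sum(weights[name] * signals[name] for name in weights)
    return round(combined * 1000)

# A fictional citizen with middling-to-good signals:
print(social_credit_score(0.8, 0.7, 0.5, 0.6, 0.9))  # -> 725

The part that matters is the weighting: whoever chooses those weights is effectively deciding which behaviours count as "esteemed", which is exactly the kind of power this thread is worried about.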


message 49: by Ian (new)

Ian Miller | 1857 comments In Calgary, I don't think I ever heard anything particularly unique to them - they were wallowing in oil money, they did spend it well on good infrastructure, and the health system there was cheaper than anywhere else in North America at that time. Oil money does wonders if you have it :-)


message 50: by J. (new)

J. Gowin | 7977 comments If the technology becomes available, I have two concerns:

1.) Planned obsolescence
Consider the rate of obsolescence we have observed in personal electronics over the past few decades. Now apply that to a gadget which is effectively part of your brain. I imagine there would be people rushing to swap out part of their cortexes the same way they stand in line for the latest Apple iPhone. There would also be people who either refused or couldn't afford to upgrade - what would become of them? Would corporations still send mandatory OS updates that nerfed/killed older models?

2.) Social engineering
The amount of personal information that is being harvested and used against us is already staggering. From our clicks and view times, the companies who are watching can deduce our preferences, education, health, finances, IQ and personality traits better than we can. While the tech is separate from our physical bodies, we can choose to walk away or limit our interactions to direct business matters. This option disappears once you are wired to it. Imagine the possibilities of a society which cannot turn off Fox News/MSNBC. Could you raise propaganda to the level of outright memetic warfare? Have we already seen the opening shots of such conflicts during recent elections?

