RT @j2bryson: As we explained in 2017 https://t.co/bag5BaCCoW the other two co-authors are law professors so yes you can use this in your c…
As we explained in 2017 https://t.co/bag5BaCCoW the other two co-authors are law professors, so yes, you can use this in your cases. #AIEthics #anthropomorphism
@thebrainofchris As we explained in 2017 https://t.co/bag5BaCCoW the other two co-authors are law professors, so yes, you can use this in your cases.
@favourboroks anthropomorphism is a problem when a) it involves dehumanising actual humans (so putting technology investment above e.g. social justice rather than in the service of social justice) or b) avoiding responsibility https://t.co/bag5BaCCoW
#AIEthics is NEVER about obligations of machines. https://t.co/bag5BaCCoW I'm talking about the obligations of those who manufacture, and those who sell weapons systems, given that these all have AI in them now. I assume buyers demand cybersecure control
@thehertieschool sorry I forgot to tag you! If you are on mastodon, I can edit to tag you there :-)
So grateful for the opportunity and how lovely to meet my professor again!!
It’s Alum weekend at Hertie School! Kate Yang @adcolors helped translate my article “Of, for, and by the people: the legal lacuna of synthetic persons” into Chinese, AND brought me a copy of the book. Thanks! https://t.co/bag5BaCCoW https://t.co/ljPLImtP
RT @j2bryson: @binocularity @lilianedwards The EU rightly attributes responsibility for all action to either the provider of the AI capacit…
@binocularity @lilianedwards The EU rightly attributes responsibility for all action to either the provider of the AI capacity or the deployer of applications containing that capacity (or possibly individual end users misusing applications). cf https://t.co/
Really interesting article on the desirability (or otherwise) of legal personhood for AI/robots
@JeniT Thanks Jeni (& @azeem for directing me here.) Obviously fully agree since I've been arguing this way for years. For any of your skeptical readers, here is a paper with two very impressive law professors, open access, on why we MUST #deanthropomo
@albert_sabater @sharongoldman @elonmusk @GaryMarcus @stevewoz @timnitGebru @emilymbender @random_walker @AlexCEngler @geomblog I agree with you that the #climateCrisis and #sustainability are our greatest challenge. But getting on top of #digitalGovernance
RT @j2bryson: @gabramosp @UNESCO Of course by "book" I mean actually of course this article with @Tom_Grant_2012 & @ProfDiamantis https://t…
@gabramosp @UNESCO Of course by "book" I mean actually of course this article with @Tom_Grant_2012 & @ProfDiamantis https://t.co/bag5BaCCoW
Of, for, and by the people: the legal lacuna of synthetic persons - Artificial Intelligence and Law https://t.co/tMzZZv2WPN
@NiNanjira Good paraphrase:-) here’s the paper https://t.co/bag5BaCCoW what brought this up again?
@gentile_giulia https://t.co/bag5BaCCoW with @ProfDiamantis @Tom_Grant_2012 remind me to send the legal chapter when my laptop is out
RT @EvilAICartoons: An interesting paper on this legal personhood of AI: https://t.co/nagez4GFho
An interesting paper on this legal personhood of AI: https://t.co/nagez4GFho
@sean_welsh77 @David_Gunkel @jadelgador @mitpress @SSRN or more academically & legally https://t.co/bag5BaCCoW
Here’s some background literature by Joanna Bryson @j2bryson and colleagues: 'Of, for, and by the people: the legal lacuna of synthetic persons' https://t.co/4VUnuRpBly 'Patiency is not a virtue: the design of intelligent systems and systems of ethics'
@phoebemoore @jscottwagner @emilymbender this is the piece I used to set on this https://t.co/x5oJnKx2CU (not by a lawyer!) i think it hits the main points tho I disagree with her section 3 :)
@iphigenie @morungos @martingoodson I like your reasoning generally, but it's not the software that skips responsibility, it's the organisations that develop & operate it that aren't held responsible to show due diligence, which they should be. https:/
@jedichaz No, computers don't have ambition or take control. This was the incompetence of the Post Office, @fujitsu_uk, and the UK Government. We need to give particularly our governments & NGOs (like @algorithmwatch) competence to recognise & addr
@SurviveThrive2 @David_Gunkel I don't think self sustenance is really the issue (grass & microbia do that) but I do fully agree that something truly human like should never be owned or designed. But I think your (admirable) empathic intuitions could be
RT @dmonett: Great I found @j2bryson's piece again, juggling w/ TW and LI. "[D]ifficulties in holding 'electronic persons' accountable whe…
For those new to this topic Tom, Mihailis & I have a significant publication in this area: https://t.co/bag5BaCCoW
Great, I found @j2bryson's piece again, juggling w/ TW and LI. "[D]ifficulties in holding 'electronic persons' accountable when they violate the rights of others outweigh the highly precarious moral interests that AI legal personhood might protect." #AI h
RT @skydog811: Of, for, and by the people: the legal lacuna of synthetic persons - Artificial Intelligence and Law https://t.co/FnyHV0uKea
Of, for, and by the people: the legal lacuna of synthetic persons - Artificial Intelligence and Law https://t.co/FnyHV0uKea
@David_Gunkel @RobotRules @ProfDiamantis It might have been @Tom_Grant_2012 but anyway see abstract here — certainly not my phrasing https://t.co/bag5BaCCoW
@gfodor @jonst0kes We want to encourage corporations to make the safest, most maintainable products they can. https://t.co/vOKx2ixPwL
Some have argued that AI systems should have legal personhood akin to corporations, though this idea has its critics (e.g. see paper by @j2bryson et al). https://t.co/nagez4GFho
@markhughes @alexstamos Certainly technologies can’t be assigned responsibility:) https://t.co/bag5BaCCoW
@altrishaw @WomeninAIEthics @AlaricAloor @WLinAI Of course all persons are non binary :-) https://t.co/vOKx2ixPwL
RT @fial_la: Legal personhood for beings based on artificial intelligence: a necessary advance, an inconvenient fiction, or a dangerous de…
Legal personhood for beings based on artificial intelligence: a necessary advance, an inconvenient fiction, or a dangerous decision? AAVV, "Of, for, and by the people: the legal lacuna of synthetic persons", 2017. https://t.co/ppWDJbPuaP
RT @j2bryson: @mireillemoret @Faust_III @jksmith8806 @grok_ @thehertieschool Corporations (& idols & rivers) have legal personality because…
@mireillemoret @Faust_III @jksmith8806 @grok_ @thehertieschool Corporations (& idols & rivers) have legal personality because there are people composing or dependent on them who can defend those rights and observe / know their responsibilities. Als
#embeddingAI IMO legal agency has already been overextended, cf @KatharinaPistor's book. AI shouldn't be treated as a responsible agent; rather its owners/operators should be held responsible, unless there's a defect in its construction, in which case the developer/manufacturer should be. cf htt
@Inframethod now I do a lot of work on policy with the aim of making the world more stable and livable, see e.g. https://t.co/OiA8aHFm7N where I specifically say "don't get bogged down defining AI, just regulate software". See also for example https://t.c
RT @j2bryson: @aimeevanrobot @Boring_AI @RespRobotics @carissaveliz There are arguments for *legal* (not moral) agency out of efficiency fo…
@aimeevanrobot @Boring_AI @RespRobotics @carissaveliz There are arguments for *legal* (not moral) agency out of efficiency for contracts & liability. We dismantle those here https://t.co/bag5BaCCoW and see also @KatharinaPistor's recent book concerning
@OliverBridge12 @aimeevanrobot We can only barely make other social species (with many similar evolved traits) moral agents e.g. in our households, not in law (tho they do get moral patiency through our shared identity). With artefacts, all forms of moral
@cerb20 @karmel80 This language choice is deliberate and certainly useful for ensuring accountability in both governments and industry. See for example https://t.co/bag5BaCCoW
@UrbanDebris @LordElend @BVLSingler @djleufer @MCoeckelbergh @hermannbella @MeineckeLisa @DrDihal @stephenjcave @NoelSharkey @rodneyabrooks @aimeevanrobot if you are taking stretches then you might also want this on legal personality https://t.co/vOKx2ixPw
RT @j2bryson: @conitzer @vdignum @BennieMols @nrc Yes, could easily be made a legal person, so we need to defend against that https://t.co/…
"difficulties in holding “electronic persons” accountable when they violate the rights of others outweigh the highly precarious moral interests that AI legal personhood might protect." https://t.co/YxvbwJPFj7
The "robot on the board" was just a PR exercise. You can't actually have that, see https://t.co/vOKx2ixPwL but I agree the real risk is what people BELIEVE they can do with AI, which is why I love your talk :-)
@rodakker Sorry, no. As soon as we separate responsibility from humans we run into all kinds of problems, whether you want to call them moral or not. Here's another paper if you're done with the previous one https://t.co/bag5BaCCoW
cf Of, for, and by the people: the legal lacuna of synthetic persons https://t.co/Ep6GZIf4ky #politicoai #ruleoflaw #malta @BartoloClayton
RT @j2bryson: @NovaZastava AI itself can't have liability https://t.co/vOKx2ixPwL this is why OECD & G20 nations have signed up to making s…
@NovaZastava AI itself can't have liability https://t.co/vOKx2ixPwL this is why OECD & G20 nations have signed up to making sure it transparently communicates accountability to the human agencies that develop, operate, and/or own it. https://t.co/2lyT5
Sorry, I was unclear -- @tom_grant_2012 is one of the authors of the book, and @ProfDiamantis and I were coauthors with him on another paper, https://t.co/vOKx2ixPwL
@livefromtheabys This is how Bryson frames the anti-robot rights argument in her Legal Lacuna paper: that it opens legal loopholes and liability shields for corporations to exploit. She gives some standard legal examples, none of which are unique to robot
RT @j2bryson: Artefacts holding property = completely synthetic legal persons would be the ultimate shell company, impossible to dissuade.…
Artefacts holding property = completely synthetic legal persons would be the ultimate shell company, impossible to dissuade. Legal personality is already overextended, damaging #ruleoflaw. We should roll it back. https://t.co/bag5BaCCoW #aiethics @ProfDiam
@Ki_ligenz @johnchavens Because I’m trying to keep society robust enough to handle challenges and opportunities like climate change https://t.co/bag5BaCCoW By the way my position is radical not conservative for someone with a phd in AI.
@MarisaTPP @lajlafetic @denkfabrik_bmas Well, you can do some interesting moral philosophy thought experiments, but you expose yourself to enormous threat of the ultimate shell companies https://t.co/bag5BaCCoW See also my blog.
and here's Bryson's deflation of excitable demands to grant personality to robots (though I can't myself feel enthusiasm for legal fictions) https://t.co/r9rftXH4mC
RT @j2bryson: @smakelainen @jimbomorrison @charlesarthur @m_c_elish Without dissuasion errors get repeated. Look at Nixon, look at the 2008…
@smakelainen @jimbomorrison @charlesarthur @m_c_elish Without dissuasion errors get repeated. Look at Nixon, look at the 2008 crash. We don’t want the law to motivate creating systems too complex to control; we want to motivate transparency & accountab
@LeConcurrential There's no way to reliably punish algorithms themselves. Here's a formal publication about that https://t.co/vOKx2ixPwL it should consolidate your intuitions.
@sentientism @elanazeide @Floridi @mitpress Thinking is useful, overextension of legal rights only generates corruption though. Short read https://t.co/K6Y3RClkUk actual display of academic expertise & more importantly, useful details https://t.co/b
@Maperez324 @spillteori @Abebab cf https://t.co/vOKx2ixPwL and https://t.co/kuhJOKYsNo if you haven't already (I edited out all the people I KNOW already have heard about these papers, sorry if I should know you have.)
@SurviveThrive2 @nbanteka It wasn't legislation it was an EP recommendation that was rejected by the EC but yeah, that's fully quoted & cited on the first pages of https://t.co/vOKx2ixPwL
@nbanteka Here's the Malta thing https://t.co/hrQvkOFVXw and again my own paper https://t.co/vOKx2ixPwL (realised I accidentally replied to @BrettFrischmann not you!)
@BrettFrischmann @nbanteka Ha, no recent cases I know about except there was an argument in Malta I'm not sure the outcome of, and there's one in Estonia we won. cf my & colleagues 2017 paper "inspired by" the European Parliament, https://t.co/vOKx2ixP
RT @j2bryson: @teemu_roos @PrivacyFin @aram Obviously I think you should read https://t.co/vOKx2ixPwL but also @FrankPasquale has some grea…
@teemu_roos @PrivacyFin @aram Obviously I think you should read https://t.co/vOKx2ixPwL but also @FrankPasquale has some great stuff, see his google scholar page, maybe https://t.co/EAhiwYXt9B ?
RT @j2bryson: @kerstingAIML @teemu_roos @kaliouby @wef @AIESConf I've written about that extensively. Here from a law perspective (with lea…
@kerstingAIML @teemu_roos @kaliouby @wef @AIESConf I've written about that extensively. Here from a law perspective (with leading legal experts) https://t.co/bag5BaCCoW here from a philosophical & design perspective https://t.co/kuhJOKYsNo This is not
@EyeOnThePitch @DrPehar @evansd66 I can’t say it better than I said it here: https://t.co/bag5BaCCoW Artifacts aren’t just “things”; they are definitionally “things people designed”, and that category makes them non-peer. To read me saying it over & over ag