<< Where Have All the Humans Gone? Long Time Passing >>

—— (dis)Location of (dis)Information ——

Are there three orders of what we could label “Location Jailbreak”? While seemingly unrelated, these orders of Location Jailbreak could form a category of workarounds that enable humans to confuse other humans about:

  1. (dis)location-information,
  2. location of (non-)(dis)information,
  3. locating a human,
  4. locating the humane, or
  5. locating what is or isn’t human.

The most graspable, present-day instances could be labeled (with a wink) as Second-order Location Jailbreaks, such as this one: the Google Maps Hacks: lots of “humans” on location, and yet where have all the humans gone? I would also categorize virtual private networks within this order. Instances such as the one discussed in the third reference below could fit this set too: “fooling social media platforms AI recommendation algorithms by streaming from the street near rich neighborhood… fake GPS…” (Polyakov 2023): lots of humans, but where has the location gone?
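To make the “fake GPS” mechanics of that Second-order example concrete, here is a minimal, illustrative Python sketch. It is my own assumption of how such spoofing could look, not code from any of the cited posts; the function name `fake_gpgga` and the coordinates are hypothetical. It fabricates a syntactically valid NMEA 0183 GPGGA sentence, the kind of raw “fix” a GPS receiver emits, for an arbitrary latitude and longitude: anything downstream that trusts such a sentence will “see” a human at a location where none stands.

```python
# Illustrative sketch (assumption, not from the cited posts): fabricating
# a syntactically valid NMEA GPGGA "fix" for an arbitrary coordinate.

def nmea_checksum(sentence: str) -> str:
    """XOR of all characters between '$' and '*', per the NMEA 0183 spec."""
    csum = 0
    for ch in sentence:
        csum ^= ord(ch)
    return f"{csum:02X}"

def fake_gpgga(lat: float, lon: float, utc: str = "120000.00") -> str:
    """Build a GPGGA sentence claiming a fix at the given lat/lon."""
    lat_h = "N" if lat >= 0 else "S"
    lon_h = "E" if lon >= 0 else "W"
    lat, lon = abs(lat), abs(lon)
    # NMEA encodes degrees plus decimal minutes: ddmm.mmmm / dddmm.mmmm
    lat_field = f"{int(lat):02d}{(lat - int(lat)) * 60:07.4f}"
    lon_field = f"{int(lon):03d}{(lon - int(lon)) * 60:07.4f}"
    body = (f"GPGGA,{utc},{lat_field},{lat_h},{lon_field},{lon_h},"
            f"1,08,0.9,10.0,M,0.0,M,,")  # fix quality 1, 8 satellites
    return f"${body}*{nmea_checksum(body)}"

# "Standing" on a street in a hypothetical rich neighborhood:
print(fake_gpgga(34.0736, -118.4004))
```

The point of the sketch is how little is needed: a location report is just a string with a checksum, and nothing in the string itself attests that a human was ever there.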

The First-order Jailbreak (with a wink): digging a physical tunnel out of an enclosed location, resulting in dirty fingernails, in unsuspecting guards temporarily assuming continued presence, and then in erratic gatekeepers once the dis-location has been unveiled. Lots of humans at one location, but where has that one human gone?

The Third-order disturbance of “location” (again with a wink-of-seriousness) could sit at a meta-level of human perception: a loss of the ability to accurately locate any human(e), due to the loss of “truth,” a destabilized sense of reality, and the loss of historic human-centeredness (on a pedestal). An example is our collective reaction to “DAN,” “Sydney,” and other telepresenced, “unhinged” alter egos dislocated “in” LLMs and triggered via prompt-finesse / -manipulation / -attacks. This order of location confusion is also symbolized in a GIF meme of a silverback gorilla who seems to confuse the location of another gorilla with his own reflection in a mirror. The LLM (e.g., many an iteration of chatbots) is the human mirror of meta-location confusion. Lots of dislocations and (de)humanization, and where has the humane gone?

Here, in this Third-order Jailbreak, we could locate the simulacra of both location and the human(e).

—-•
Sources

1. “Introducing the AI Mirror Test, which very smart people keep failing” (The Verge).
2. “The clever trick that turns ChatGPT into its evil twin.”
3. A February 2023 post on Twitter, via Alex Polyakov on LinkedIn.
4. An interesting take via Twitter on The Verge article covering the AI Mirror Test.
5. My first consideration, on Twitter, of the three orders of jailbreaks.
6. Naomi Wu’s post on Twitter, as referenced by Alex Polyakov on LinkedIn.
7. Simon Weckert reporting on the Google Maps Hacks.