The AI Thread

Indeed, this will only help pave the way for a more authoritarian internet.
And push more people to darkweb services.
I think the tendency you highlight exists, but the difference between "running an OS you control" and "going to the darknet for your internet services" is pretty large.
 
I think the tendency you highlight exists, but the difference between "running an OS you control" and "going to the darknet for your internet services" is pretty large.

It just shows that I don't know the first thing about the darknet.

Or maybe that is what I wanted people to believe ^_^ (but no, I don't)
 
Apple is about to announce a new technology for scanning individual users' iPhones for banned content. While it will be billed as a tool for detecting child abuse imagery, its potential for misuse is vast based on details entering the public domain.
This will likely mean a neural network will scan the phone for suspicious content, which will then be verified by human moderators to reduce false positives.
There will be a monitoring team somewhere, going through people's private pictures. Not all of them, but the most 'interesting' ones.
Apple may be shooting itself in the foot with such announcements.
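For the curious, the matching step would presumably look something like the sketch below: hash each photo on-device, compare against a blocklist of known-bad hashes, and escalate to human review only past some match threshold. This toy version uses a simple "average hash" as a stand-in for whatever perceptual hash Apple actually uses, and the blocklist and threshold here are made up:

```python
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    # Downscale to an 8x8 grayscale thumbnail; each bit records whether a
    # pixel is brighter than the mean. Near-duplicate images get nearby hashes.
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

BLOCKLIST: set[int] = set()  # hypothetical database of known-bad hashes
THRESHOLD = 5                # max bit-distance that still counts as a match

def flag_for_human_review(path: str) -> bool:
    h = average_hash(path)
    return any(hamming(h, bad) <= THRESHOLD for bad in BLOCKLIST)
```

The point of a perceptual hash, as opposed to an ordinary cryptographic one, is that lightly edited copies of an image still land within the threshold.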
 
Apple may be shooting itself in the foot with such announcements.
It seems they are totally shooting themselves in the foot, and presumably for no good reason. I assume that most people know that kiddie porn is illegal and take measures to protect themselves. I am convinced it is a "look, we are doing something, do not regulate us" move, rather than actually trying to make the world better.
 
Ultimately, such tech won't stop any illegal content. People could just upload the images altered beyond recognition, then include a reference to a key - and yes, this "key" can also be a material one, such as a kaleidoscope - so the image itself isn't recognized as the banned content, regardless of whether it is accessible or not.
 
Ultimately, such tech won't stop any illegal content. People could just upload the images altered beyond recognition, then include a reference to a key - and yes, this "key" can also be a material one, such as a kaleidoscope - so the image itself isn't recognized as the banned content, regardless of whether it is accessible or not.
Yeah, and the obvious way to "alter" them is with asymmetric cryptography. People probably should be doing this with every bit of data they have, and certainly every bit of data they send.
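For anyone wondering what that looks like in practice, here is a minimal sketch with the Python cryptography package, using the standard hybrid construction: a fresh symmetric key encrypts the data, and the recipient's public RSA key encrypts that symmetric key (RSA alone can't encrypt anything as large as an image):

```python
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Recipient's keypair (generated once; only the public key is ever shared).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def encrypt_for(public_key, data: bytes) -> tuple[bytes, bytes]:
    # Hybrid scheme: a fresh symmetric key encrypts the data, and the
    # recipient's public key wraps that symmetric key.
    sym_key = Fernet.generate_key()
    ciphertext = Fernet(sym_key).encrypt(data)
    wrapped_key = public_key.encrypt(sym_key, OAEP)
    return wrapped_key, ciphertext

def decrypt(private_key, wrapped_key: bytes, ciphertext: bytes) -> bytes:
    sym_key = private_key.decrypt(wrapped_key, OAEP)
    return Fernet(sym_key).decrypt(ciphertext)

wrapped, blob = encrypt_for(public_key, b"any image bytes here")
assert decrypt(private_key, wrapped, blob) == b"any image bytes here"
```

Only the holder of the private key can undo this, so anything scanned in transit or on a server is just noise.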
 
Images can be sent encrypted or distorted, but if they are viewed on the phone, the illegal stuff can technically be detected.
 
Images can be sent encrypted or distorted, but if they are viewed on the phone, the illegal stuff can technically be detected.
Yeah, if you do anything on an OS you did not install from source then it "technically can be detected". The illicit substances crowd figured that out decades ago, I would have thought the kiddie porn crowd would have too.
 
Yeah, and the obvious way to "alter" them is with asymmetric cryptography. People probably should be doing this with every bit of data they have, and certainly every bit of data they send.

I think the color mixing metaphor is nice :)


Cryptography (in an ancient form) was notably used by Sparta, with the device known as the "scytale": a strip of leather or parchment was wound around a wooden rod and the message written along the rod's length, so that the unwound strip carried only a meaningless jumble of letters; the text became legible again only when wrapped around a rod of the same diameter.
Of course the system relied on officers having rods of the matching diameter, which made the decryption symmetric - a single shared key.
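The scytale is easy to simulate, if anyone wants to play with it: the rod's circumference (letters per turn) is the shared key, and encryption is just a row/column transposition. A quick sketch (the message is, of course, made up):

```python
import math

def scytale_encrypt(plaintext: str, circumference: int) -> str:
    # Writing along the rod = filling rows of `circumference` letters;
    # unwinding the strip = reading the grid off column by column.
    rows = math.ceil(len(plaintext) / circumference)
    padded = plaintext.ljust(rows * circumference, "_")
    grid = [padded[i:i + circumference] for i in range(0, len(padded), circumference)]
    return "".join(row[c] for c in range(circumference) for row in grid)

def scytale_decrypt(ciphertext: str, circumference: int) -> str:
    # Re-wrapping around an identical rod inverts the transposition.
    rows = len(ciphertext) // circumference
    return scytale_encrypt(ciphertext, rows).rstrip("_")

msg = scytale_encrypt("SEND MORE HOPLITES", 5)
assert scytale_decrypt(msg, 5) == "SEND MORE HOPLITES"
```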

Images can be sent encrypted or distorted, but if they are viewed on the phone, the illegal stuff can technically be detected.

Not if, on your phone, the image is seemingly random distortion, and you then use a (material, in my example) kaleidoscope tuned in one specific way or in a small number of ways. Of course, the latter is also symmetric, but it sits outside the electronic system (it could also be tied to some known tunings of a kaleidoscope which are used for regular stuff, and the object itself can't be banned either). There can be variations and intermixing with asymmetric means too :)
 
That said, @Samson , I do wonder whether some algorithm can be found to help you identify even a very large number (hundreds of digits, afaik, are currently used) as the product of two specific primes. I feel that it's not theoretically impossible for this to happen.
 
That said, @Samson , I do wonder whether some algorithm can be found to help you identify even a very large number (hundreds of digits, afaik, are currently used) as the product of two specific primes. I feel that it's not theoretically impossible for this to happen.
As I understand it, they have developed the algorithm, but not the hardware to run it on. You need something like the same number of qubits as the number has binary digits.
 
As I understand it, they have developed the algorithm, but not the hardware to run it on. You need something like the same number of qubits as the number has binary digits.

I actually came across that a couple of days ago (maybe the video was very recently uploaded, or was promoted by the YouTube algorithm), but yes, that would require quantum computing (?) or unrealistically powerful classical computing.


I was thinking of something less flashy, tbh :D I mean... he talks about "1 billion steps in the computation". Intuitively, I was imagining something far less taxing, and more about unknown properties of the natural number sequence/primes.
 
I actually came across that a couple of days ago (maybe the video was very recently uploaded, or was promoted by the YouTube algorithm), but yes, that would require quantum computing (?) or unrealistically powerful classical computing.

I was thinking of something less flashy, tbh :D
New maths is less flashy than simple engineering? What kind of philosopher are you? :P
 
Not if, on your phone, the image is seemingly random distortion, and you then use a (material, in my example) kaleidoscope tuned in one specific way or in a small number of ways. Of course, the latter is also symmetric, but it sits outside the electronic system (it could also be tied to some known tunings of a kaleidoscope which are used for regular stuff, and the object itself can't be banned either). There can be variations and intermixing with asymmetric means too :)
Impractical compared to simply using a trusted OS. Nobody will bother with these kaleidoscopes.
Edit: Besides, these distorted images may also be detected and labeled as suspicious by detection software.

That said, @Samson , I do wonder whether some algorithm can be found to help you identify even a very large (hundreds of digits, afaik, are currently used) number as the product of two primes. I feel that it's not theoretically impossible for this to happen.
https://en.wikipedia.org/wiki/Shor's_algorithm
No classical version with polynomial running time is known.
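For intuition, Shor's reduction can be run classically on toy numbers; the only step a quantum computer actually speeds up is the period-finding loop, which is exponential on classical hardware. A sketch:

```python
from math import gcd

def factor_via_period(N: int, a: int) -> int | None:
    # Shor's reduction: find the period r of a^x mod N, then
    # gcd(a^(r/2) - 1, N) is (often) a non-trivial factor of N.
    if gcd(a, N) != 1:
        return gcd(a, N)          # got lucky: a already shares a factor with N
    r, x = 1, a % N
    while x != 1:                  # brute-force period finding: this loop is
        x = (x * a) % N            # the only step quantum hardware speeds up
        r += 1
    if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
        return None                # unlucky choice of a; retry with another
    f = gcd(pow(a, r // 2, N) - 1, N)
    return f if 1 < f < N else None

print(factor_via_period(15, 7))   # -> 3  (7 has period 4 mod 15)
```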
 
New maths is less flashy than simple engineering? What kind of philosopher are you? :p

Engineering is flashy; math/philosophy should be about going deeper, where there is a limited source of light ^_^

Impractical compared to simply using a trusted OS. Nobody will bother with these kaleidoscopes.
Edit: Besides, these distorted images may also be detected and labeled as suspicious by detection software.


https://en.wikipedia.org/wiki/Shor's_algorithm
No classical version with polynomial running time is known.

Yes, the Shor video (with Shor himself) is what I posted :D

As for the outside-the-system means, like the humble kaleidoscope, you should keep in mind that there are practically unlimited ways to tune it, which removes the ability to label an image on the computer as suspicious (unless you have info on a limited number of tunings of the kaleidoscope/similar). It is, in effect, the Enigma machine's (not great, not terrible) way of handling coding: a public or accessible key (the tuning of the kaleidoscope) combined with a non-trivial starting setting (the distorted image). But as I said, those can be fused with asymmetric methods. A better question would be whether someone who only cares about child porn would bother with such stuff.
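For concreteness, a digital analogue of the kaleidoscope idea might look like the sketch below: treat the "tuning" as a shared secret, expand it into a pseudo-random pad, and XOR it with the image bytes; applying the same tuning again restores the original. (A toy stream cipher for illustration only, not something audited for real use.)

```python
import hashlib

def keystream(tuning: bytes, n: int) -> bytes:
    # Expand the shared "tuning" into n pseudo-random bytes
    # (SHA-256 run in counter mode).
    out = bytearray()
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(tuning + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])

def distort(image_bytes: bytes, tuning: bytes) -> bytes:
    # XOR with the pad; running distort() again with the same tuning undoes it.
    pad = keystream(tuning, len(image_bytes))
    return bytes(a ^ b for a, b in zip(image_bytes, pad))

blob = distort(b"some image bytes", b"tuning #42")
assert distort(blob, b"tuning #42") == b"some image bytes"
```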
 
Apple to use AI to look for kiddie porn on your phone, but what else will they look for?

Apple is about to announce a new technology for scanning individual users' iPhones for banned content. While it will be billed as a tool for detecting child abuse imagery, its potential for misuse is vast based on details entering the public domain.

Rather than using age-old hash-matching technology, Apple's new tool – due to be announced today along with a technical whitepaper, we are told – will use machine learning techniques to identify images of abused children.

Governments in the West and authoritarian regions alike will be delighted by this initiative. What's to stop China (or some other censorious regime such as Russia or the UK) from feeding images of wanted fugitives into this technology and using that to physically locate them?

Governments can do that and more. If it can detect images, it can necessarily also be configured to detect any set of words, phrases, etc. as well... and do so regardless of the app used. Then filter and alert based on those.

Protecting children is important. Consider me doubtful of big tech, however, when sites like Facebook are still way up there for usage in human trafficking to this day (I think it might even be the worst, by volume?). Seems like nearly every time these corporations or politicians say "think of the children", they're doing so while laying the necks of children and adults alike on a chopping block.

It might not be long before places like China start arresting people even before they protest, based on phrases found on their phones etc. That would make it hard to organize anything like that. The West will be better, but probably not by much.
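To make the point above concrete: the text version really is trivial, since the scanning machinery stays the same and only the blocklist changes. A hypothetical sketch (the watchlist entries are invented, set by whoever controls the list):

```python
WATCHLIST = {"banned phrase", "wanted name"}  # hypothetical entries

def scan_text(text: str) -> set[str]:
    # Same shape as the image pipeline: match against a blocklist, then alert.
    lowered = text.lower()
    return {phrase for phrase in WATCHLIST if phrase in lowered}

def should_alert(text: str) -> bool:
    return bool(scan_text(text))
```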
 
Apple's kiddie-porn thing is not that good an AI; these two images produce the same hash:

[two attached images, one of them a dog photo, that nonetheless produce the same hash]


This demonstrates that one can reverse-engineer images to have hashes that match kiddie porn. If you sent a couple of these to a "mark", you would effectively be digitally swatting them.

Sources: El Reg, GitHub
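For anyone curious how such collisions are made: if the hash comes out of a neural network, the model is differentiable, so you can gradient-descend an image toward any target hash. A toy sketch with PyTorch, using a small random CNN as a stand-in (this is not Apple's actual NeuralHash model):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Tiny random CNN standing in for a neural perceptual hash: the sign of each
# output unit is one bit of the 96-bit "hash".
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=5, stride=4), nn.ReLU(),
    nn.Flatten(), nn.Linear(8 * 15 * 15, 96),
)

def neural_hash(img: torch.Tensor) -> torch.Tensor:
    return (model(img) > 0).float()

source = torch.rand(1, 3, 64, 64)                    # image we will perturb
target_hash = neural_hash(torch.rand(1, 3, 64, 64))  # hash we want to collide with

x = source.clone().requires_grad_(True)
opt = torch.optim.Adam([x], lr=1e-2)
for _ in range(500):
    opt.zero_grad()
    logits = model(x)
    # Hinge loss pushing each pre-binarization logit toward the sign the
    # target hash demands (+1 for a 1-bit, -1 for a 0-bit).
    loss = torch.relu(1.0 - logits * (2 * target_hash - 1)).mean()
    loss.backward()
    opt.step()
    x.data.clamp_(0.0, 1.0)  # keep the perturbed tensor a valid image

print("matching bits:", int((neural_hash(x) == target_hash).sum().item()), "/ 96")
```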
 
I'd imagine that the reverse can also be constructed easily: the dog morphs into something illegal using some algorithm.
At least this would allow one to flood the web with images that get picked up as suspicious, while only a tiny fraction of them can be translated into something illegal.
 
Nice. Can you explain what is happening? (Is it really morphed from "one" image, or is this more trivial?)

By the way, on the third attempt I got this one, which has shades of Paul Klee ^_^

[attached image]
 