Negobot keeps track of its conversations with all users, both for future reference and to keep a record that could be handed to the authorities if the subject is, in fact, determined to be a paedophile. The bot gives out only brief, trivial information: name, age, gender and hometown.
If the subject wants to keep talking, the bot may talk about favorite films, music, drugs, or family issues, but it doesn’t get explicit until sex comes into the conversation.
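That staged disclosure could be modelled as a simple escalation ladder. The sketch below is purely hypothetical (Negobot's actual implementation is not public); the topic lists come from the description above, while the function names and the message-count threshold are illustrative assumptions:

```python
# Hypothetical sketch of staged disclosure, loosely modelled on the
# behaviour described above. Not Negobot's real code.

# Each stage maps to the topics the bot is willing to discuss.
STAGES = {
    0: ["name", "age", "gender", "hometown"],          # trivial small talk
    1: ["films", "music", "drugs", "family issues"],   # if the subject keeps talking
    2: ["explicit"],                                   # only after the subject raises sex
}

def allowed_topics(message_count: int, subject_raised_sex: bool) -> list:
    """Return the topics the bot may discuss at this point in the chat."""
    topics = list(STAGES[0])
    if message_count > 5:           # arbitrary threshold for "wants to keep talking"
        topics += STAGES[1]
    if subject_raised_sex:          # the bot never escalates first
        topics += STAGES[2]
    return topics
```

The key design point reported about Negobot is captured in the last check: the bot stays passive and only moves to explicit territory if the subject brings sex into the conversation.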
It occurred to me that these scripts had a connection to ELIZA, one of the earliest examples of a natural language processing program.
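ELIZA, written by Joseph Weizenbaum in the 1960s, worked by matching keywords in the user's input against transformation rules and reflecting fragments back as questions. A minimal sketch of that pattern-and-template approach (the two rules here are illustrative, not Weizenbaum's originals):

```python
import re

# A minimal ELIZA-style rule set: match a pattern, reflect part of it back.
RULES = [
    (re.compile(r"\bI need (.+)", re.I), "Why do you need {0}?"),
    (re.compile(r"\bI am (.+)", re.I), "How long have you been {0}?"),
]

def respond(utterance: str) -> str:
    """Return the first matching rule's response, or a stock deflection."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(match.group(1))
    return "Please tell me more."  # default deflection when nothing matches
```

For example, `respond("I need a holiday")` yields "Why do you need a holiday?" The trick, then as now, is that shallow pattern matching can sustain the illusion of understanding for a surprisingly long time.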
Building on that kind of information, a new chat bot will serve as a virtual Lolita, posing as a 14-year-old schoolgirl, with the aim of lulling paedophiles into thinking the bot is human and thus making it easier for law enforcement to intercept them in chat rooms.
She rose to the lofty heights of Executive Editor of eWEEK, popped out with the 2008 crash and joined the freelancer economy. Alongside Naked Security, Lisa has written for CIO Mag, Computerworld, PC Mag, IT Expert Voice, Software Quality Connection, Time, and the US and British editions of HP's Input/Output.
Is this starting to sound uncomfortably like entrapment? John Carr, a UK government adviser on child protection, told the BBC that the technology could aid overburdened police, but that the software could well cross the line and entice people to do things they otherwise might not.

The BBC reports that Negobot has been field-tested on Google chat and could be translated into other languages.
Its researchers admit that Negobot has limitations – it doesn’t, for example, understand irony.