Why even the most advanced AI might not be as useful as you think
Say you had managed to build a general AI. Given its level of intelligence, you would think booking you a holiday would be a simple task. And it would be, if it weren't for robots.txt and the terms and conditions websites have against non-human agents using their services.
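To make that concrete, here is a minimal Python sketch of the check a well-behaved agent would run before touching a site, using the standard library's urllib.robotparser. The site URL, path, and user-agent string are purely illustrative, not taken from any real airline.

```python
import urllib.robotparser

# Fetch and parse the site's robots.txt (URL is illustrative).
robots = urllib.robotparser.RobotFileParser()
robots.set_url("https://www.example-airline.com/robots.txt")
robots.read()

# Ask whether our agent is allowed to fetch the booking page.
agent = "HolidayBookingBot/1.0"  # hypothetical user-agent string
page = "https://www.example-airline.com/booking"

if robots.can_fetch(agent, page):
    print("robots.txt permits automated access to", page)
else:
    print("robots.txt forbids automated access; a polite bot stops here")
```

A polite agent stops at that last branch, which is precisely the problem: the AI is technically capable of carrying on, but the site has told it not to.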
Ryanair, for example, forbids booking through any automated system.
That means that although your AI is quite capable of booking your holiday, it would actually breach the terms and conditions of the booking for it to do so.
Will this be an issue?
It already does. For the time being, lots of companies have to employ low-paid workers to trawl through sites doing work that a simple spider program could do far more efficiently, just because the websites being scraped forbid the use of scrapers.
It doesn't really matter whether the websites can tell if the agent visiting them is human or not; there will always be ways to fake being human online. The issue arises more from the legal challenges that any company doing this at scale would face. Even if there are no laws against breaking the terms and conditions of your implicit agreement with LinkedIn, Microsoft will come after you if it sees you either as an example to be made of or as a potential pot of gold to mine.
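Faking it really is that easy, which is why the deterrent ends up being legal rather than technical. As a rough sketch (the URL and header string below are just examples), a script can simply announce itself as an ordinary desktop browser:

```python
import requests

# A scripted request that presents itself as an ordinary desktop browser.
# The User-Agent string is illustrative; any mainstream browser string works.
headers = {
    "User-Agent": (
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
        "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36"
    )
}

response = requests.get("https://www.example.com/profiles", headers=headers)
print(response.status_code)
```

From the server's point of view, that request looks much like any human visitor's, so the site has to fall back on its terms and conditions rather than detection alone.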
The reason websites do this is simple: LinkedIn tries to sell, at a price, the very information you might want to scrape. If you can just crawl through the site unimpeded, you have less incentive to pay them for that information. This is particularly against their interest if you then sell the information on, because you are then in direct competition with them.
Facebook stops people from crawling partly for similar reasons, the value of holding a monopoly on the information, but also in an attempt to stop outfits like Cambridge Analytica from running wild on the data from its service.