Some call it “scraping,” others call it “importing.” Either way, it’s a controversial process pitting independent software developers against the titans of the cyber world: Techies compile, or scrape, loads of data from search engines and social networking sites and pool the data on their own websites, Wired reports. Some companies, relishing the increased traffic, love the service.
But others, charging copyright infringement, are cutting off the web access that lets developers reach their data and have begun devising ways to control how their content is distributed. Many companies, Google among them, use application programming interfaces that formally limit how much data developers can pull. One insider warns that large firms will begin competing with small developers, sparking “an unfair fight.”