An Extract from my Paper "The Socially Acceptable Social Networks: A Reality Check for Information Engineering on the Internet"
Working Copy of the Paper — National Journal for Computer Science and Technology, Volume 5, Issue 1 (2014) - Narsinbhai Institute of Computer Studies and Management, Gujarat, India
This applies to any form of temporary or virtual information divergence, though it would apply mainly to the entry points of the internet, such as search engines, social networks, directories, or e-commerce listings. Although this can be applied to larger sets of data, it can also be applied to logically related subsets. One example would be to deduce and apply a context to each type of information search, ensuring a clear demarcation of the type of data actually available. Another would be to use the behavior of the user to induce relevance in a user workflow. This has to be implemented through configuration or through centralized information exchanges between the various participating or critical websites. In general, a repository that could hold every netizen's usage habits and information would be the ideal sub-solution for bifurcation.
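As a rough illustration of context deduction and bifurcation, the sketch below deduces a context from a search query and uses it to demarcate the result set. The context names, keyword sets, and tagging scheme are invented for illustration only, not part of the paper's proposal.

```python
# Hypothetical sketch: deduce a context per search, then bifurcate the
# available data so only results matching that context are returned.

CONTEXT_KEYWORDS = {
    "shopping": {"buy", "price", "discount", "cart"},
    "academic": {"paper", "journal", "thesis", "cite"},
    "social":   {"friend", "follow", "share", "post"},
}

def deduce_context(query):
    """Pick the context whose keyword set overlaps the query the most."""
    words = set(query.lower().split())
    scores = {ctx: len(words & kws) for ctx, kws in CONTEXT_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "general"

def bifurcate(results, context):
    """Keep only results tagged with the deduced context."""
    if context == "general":
        return results
    return [r for r in results if r.get("context") == context]

results = [
    {"title": "Discount laptops", "context": "shopping"},
    {"title": "Journal of CS",    "context": "academic"},
]
filtered = bifurcate(results, deduce_context("buy laptop at best price"))
```

A production system would replace the keyword overlap with a learned model of the user's behavior, as the text suggests, but the bifurcation step stays the same: a clear demarcation of the data actually served.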
There are already mechanisms in place that loosely prevent off-track information from being added to a given set. This applies to simple form data, images, documents, videos, music, and many other data forms. With more efficient robotic (automated) analysis, however, we should be able to prevent situations such as terrorism, adult content, religious violence, and racist remarks. The rules should be made available through an efficient regulatory body, depending upon the type of site. This would also help reduce information chaos by not allowing unwanted, automated, or non-contextual information to build up, and it would clearly ease the process of bifurcation of information. Regulated prevention would also mean that we are able to provide a uniform barrier in cyber society.
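A minimal sketch of such regulated prevention follows: rules keyed by site type, as if published by the regulatory body the text proposes, block off-track submissions before they enter a data set. The rule contents and site-type names here are placeholders, not an actual regulatory schema.

```python
# Illustrative "regulated prevention" filter. Rules are assumed to come
# from a regulatory body and vary by site type; these values are invented.

BLOCKED_TERMS_BY_SITE_TYPE = {
    "social_network": {"terror", "racist"},
    "kids_portal":    {"terror", "racist", "adult"},
}

def is_permitted(site_type, submission):
    """Return False if the submission contains a term blocked for this site type."""
    blocked = BLOCKED_TERMS_BY_SITE_TYPE.get(site_type, set())
    text = submission.lower()
    return not any(term in text for term in blocked)

# Usage: check each submission against the rules for the hosting site.
ok = is_permitted("social_network", "A photo of my dog")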
The simplest and most direct remedial measure to counter many information issues on the internet is to introduce more effective cyber policing. This needs to be enforced first by making relevant reporting information available to the user. An ordinary user must also have ready tools and links for the reporting mechanisms. These mechanisms need to be placed at accessible locations and must allow detailed descriptions to be captured for further analysis. Also, by creating a newer wing of the current cyber reporting and analysis mechanisms, one that is swifter in action whether online or on the ground, we remove fear from the users.
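The reporting mechanism described above can be sketched as a small intake structure. The field names and categories are assumptions chosen for illustration; the only requirement taken from the text is that a detailed description is captured for later analysis.

```python
# Hypothetical user-facing reporting mechanism: each report carries a
# detailed description plus minimal metadata for the analysis wing.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AbuseReport:
    url: str
    category: str          # e.g. "phishing", "bullying" (illustrative labels)
    description: str       # detailed description for further analysis
    reported_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

REPORT_QUEUE = []          # stand-in for the analysis wing's inbox

def submit_report(url, category, description):
    """One-click style reporting: validate minimally and queue for analysts."""
    if not description.strip():
        raise ValueError("a detailed description is required for analysis")
    report = AbuseReport(url, category, description)
    REPORT_QUEUE.append(report)
    return report
```

Placing a `submit_report`-style entry point on every page is what the text means by ready tools and accessible locations.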
Governments all across the world, especially while forming cyber laws for the newer internet, now have to enforce stricter registration procedures for the internet. These may cover web startups, dotcoms, information sites, social networks, and even academic websites, and will help curb non-genuine sites, fraudsters, scammers, and malware sites. By appointing city, state, zonal, offline, online, and national ombudsmen, we will be able to enforce policing quickly. In addition, a dedicated team of internet information providers that tracks live information would let us classify information with much greater ease. As time progresses, we should no longer allow a domain registration and hosting process so easy that it bypasses the ombudsman. The additional tasks of each ombudsman office would be certification, classification, ranking, and maintaining the historical data of its zone.
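The registration record an ombudsman office would maintain can be sketched as follows. Every field name here is an assumption derived from the duties the text lists (certification, classification, ranking, historical data), not an existing registry format.

```python
# Hypothetical per-zone registry record for an ombudsman office.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SiteRegistration:
    domain: str
    site_type: str                 # e.g. "web_startup", "academic" (illustrative)
    zone: str                      # city / state / zonal / national
    certified: bool = False
    classification: str = "unclassified"
    ranking: Optional[int] = None
    history: list = field(default_factory=list)  # historical data of the zone

    def certify(self, note):
        """Ombudsman certifies the site and records the event in its history."""
        self.certified = True
        self.history.append(note)

reg = SiteRegistration("example.org", "information_site", "state")
reg.certify("certified after registration review")
```

Tying domain registration and hosting to the creation of such a record is one way the "no registration without the ombudsman" rule could be operationalized.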
Software engineers and architects should use novel practices to create more relevance in the type of data accessible to the user. With stricter laws in place, only the right type of data, operating within a context, should be provided. The engineers should also maintain a pool of intelligent information within their own applications or software. This will allow them to retire data sets that fall out of line, and they may also choose to provide this data to other developers. By automating most of the other tasks related to bifurcation, prevention, classification, and reporting, they would be engineering more relevant software, and thereby a more relevant internet. Software and internet architects will be required to coordinate and also obtain the relevant certifications for their own properties. They also need to envision a way for software implementers to apply all of the required policies with the greatest ease. The software would also need standard integration points to quickly circulate information among common or related websites.
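The pool of intelligent information, the retirement of out-of-line data sets, and the standard integration point can be sketched together. The pool contents, the policy check, and the endpoint shape are all invented for illustration.

```python
# Sketch of an application-level information pool: data sets that fall out
# of line with current policy are retired, and a standard integration point
# exposes what remains to related sites or other developers.

POLICY_APPROVED_CONTEXTS = {"shopping", "academic", "social"}  # assumed policy

pool = {
    "catalog_2013": {"context": "shopping", "records": 1200},
    "old_scrape":   {"context": "unknown",  "records": 90},
}

def retire_out_of_line(pool):
    """Remove data sets whose context is no longer policy approved."""
    retired = [name for name, ds in pool.items()
               if ds["context"] not in POLICY_APPROVED_CONTEXTS]
    for name in retired:
        del pool[name]
    return retired

def integration_point():
    """Standard, minimal view that common or related websites could consume."""
    return {name: ds["context"] for name, ds in pool.items()}

retired = retire_out_of_line(pool)
```

Automating the retirement step on a schedule is one way the bifurcation, prevention, and classification tasks the text mentions could run without manual curation.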
The problems addressed can be classified as under, which curbs the information mess and also prevents anti-patterns from forming:
Irrelevance, Non-Context Info, Malware, Adware, Ad Driven Linking…
Underage, Adult, Unsolicited Requests, Religious Violence, Profanity, Racism…
Bullying, Financial Frauds, Misleading Mails, Chain Mails, Terrorism, Frauds…
Pay for Click, Pay for Visit, Unsolicited Meeting, Phishing, Scam…
Identity Theft, Misconduct, Irrelevance, Racism, Threats…
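The five problem groups above can be expressed as a simple lookup so that a reported issue is routed to its group. Group numbers are merely positional, and the trailing ellipses in the text mean each group is open-ended; the lookup function itself is an illustrative addition.

```python
# The five problem groups from the classification above, as a lookup table.
# Some labels (e.g. "racism", "irrelevance") deliberately appear in more
# than one group, mirroring the original lists.

PROBLEM_GROUPS = [
    {"irrelevance", "non-context info", "malware", "adware", "ad driven linking"},
    {"underage", "adult", "unsolicited requests", "religious violence",
     "profanity", "racism"},
    {"bullying", "financial frauds", "misleading mails", "chain mails",
     "terrorism", "frauds"},
    {"pay for click", "pay for visit", "unsolicited meeting", "phishing", "scam"},
    {"identity theft", "misconduct", "irrelevance", "racism", "threats"},
]

def groups_for(problem):
    """Return the (1-based) group numbers a problem label belongs to."""
    label = problem.lower()
    return [i + 1 for i, group in enumerate(PROBLEM_GROUPS) if label in group]

print(groups_for("racism"))  # → [2, 5]
```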
I envision that the future of the internet, cyber laws, information security, and governance will be very closely based on these ideas. I term these principles, depending on the point of view, the Information Chaos, Information Engineering, or Information Sanity Principles.