This article was published as a part of the Data Science Blogathon.
Web3 – sometimes written as “Web 3” or “Web 3.0” – is a term you’ve probably heard a lot recently. It simply refers to the internet’s next version, which promotes decentralised protocols and tries to lessen reliance on giant digital corporations like YouTube, Netflix, Google, and Amazon.
Web3 supporters claim that the technology will transform the internet, ushering in a new, decentralised era that will be managed by ordinary people rather than major corporations.
The third generation of web technology is known as Web 3.0 (Web3). The web, often known as the World Wide Web, is the basic layer that provides website and application services on the internet.
Because Web 3.0 is still expanding and being defined, there isn’t a single, globally acknowledged definition. But one thing is certain: Web 3.0 will place a heavy focus on decentralised apps and will make considerable use of blockchain-based technology. Machine learning and artificial intelligence (AI) will also be used in Web 3.0 to help enable more intelligent and adaptive applications.
The concept of a semantic web is another component of the developing definition of Web 3.0. Tim Berners-Lee, the inventor of the web, is one of many who have campaigned for the incorporation of semantic technology into the web.
To understand Web 3.0, we need to have a look at the history of the Internet.
Berners-Lee, a computer scientist at CERN in Europe, pioneered the early development of the World Wide Web, which became what we now call Web 1.0. In 1989 he invented HTTP (Hypertext Transfer Protocol), which allows text documents to be exchanged across a network and viewed in browser software (today, browsers such as Safari or Chrome). This era is known as the “Read-Only Web” because, broadly speaking, individuals could only read information from websites.
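To make the “read-only” pattern concrete, here is a minimal TypeScript sketch of what that interaction looks like from the client’s side: a single HTTP GET that retrieves a document and merely displays it. The URL is a placeholder, not a specific site from the era.

```typescript
// Minimal sketch of the Web 1.0 "read-only" interaction:
// the client requests a document over HTTP and simply displays it.
// The URL below is a placeholder, not a real endpoint.
async function readOnlyFetch(url: string): Promise<void> {
  const response = await fetch(url);   // plain HTTP GET
  const html = await response.text();  // the server returns a static document
  console.log(html.slice(0, 200));     // the user only reads; nothing is sent back
}

readOnlyFetch("https://example.com/index.html").catch(console.error);
```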
The earliest stage of the World Wide Web’s evolution is referred to as Web 1.0. In Web 1.0, there were just a few content providers, with the vast majority of users being content consumers. Personal websites were prevalent, and they mostly consisted of static pages housed on ISP-owned web servers or free web hosting services.
Image: https://www.royex.ae/blog/evolution-of-web-10-to-web-30-why-web-30-matters
Advertisements on websites while browsing the internet were banned in Web 1.0. Ofoto, an online digital photography site, was a typical Web 1.0 service: users could store, share, view, and print digital pictures. Web 1.0 essentially acted as a content delivery network (CDN) that enabled the presentation of information on websites. It was suitable for personal websites, it sometimes charged users based on the number of pages viewed, and it featured directories that let users look up specific information. The Web 1.0 period lasted roughly from 1991 to 2004.
Main Features of Web 1.0:
- Static, read-only pages with little or no interactivity
- Content created by a small number of providers and consumed by the vast majority of users
- Personal sites hosted on ISP-owned web servers or free web hosting services
- Directories used to locate information, with some sites charging per page viewed
Although Tim O’Reilly and Dale Dougherty organised the first Web 2.0 conference in 2004 (later known as the Web 2.0 Summit), Darcy DiNucci coined the phrase in 1999. Web 2.0 refers to websites that emphasise user-generated content, usability, and interoperability for end users all over the world.
The introduction and rapid growth of social media were among the factors that ushered in the Web 2.0 era. The ability of web servers to interpret server-side scripts, user-generated material in the form of comments, and the use of databases to store information all contributed to this shift.
Image: https://placewit.medium.com/must-do-problems-to-crack-faang-3a4143f1d2f3
Web 2.0 is also known as the participatory social web. The term refers not to a change in any technical specification, but to a change in the way web pages are designed and used. The transition is beneficial, even if that is not always obvious while the changes are taking place. Web 2.0 allows users to interact and collaborate with one another through social media, as creators of user-generated content in a virtual community. Web 2.0 is an evolution of Web 1.0.
Web 2.0 development relies on web browser technologies such as AJAX and JavaScript frameworks, which have become highly popular for building Web 2.0 sites.
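As a rough sketch of the AJAX pattern, the TypeScript snippet below posts a user comment to a server and updates part of the page without a full reload. The endpoint `/api/comments` and the element id `comment-list` are illustrative assumptions, not part of any particular framework or site.

```typescript
// Hypothetical AJAX-style interaction: send user-generated content to the
// server and update the page in place, without reloading it.
// "/api/comments" and "comment-list" are illustrative placeholders.
async function postComment(text: string): Promise<void> {
  const response = await fetch("/api/comments", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ text }),
  });
  const saved: { id: number; text: string } = await response.json();

  // Insert the new comment into the existing list without a page refresh.
  const list = document.getElementById("comment-list");
  if (list) {
    const item = document.createElement("li");
    item.textContent = `#${saved.id}: ${saved.text}`;
    list.appendChild(item);
  }
}
```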
These Web 2.0 breakthroughs paved the way for the dominance of apps in the second decade of the millennium, greatly increasing online participation and consumption. Web 2.0 has been a major boon for some enterprises, while for others it poses an existential threat.
The following are some of the ways Web 2.0 users communicate and share their views, opinions, and experiences:
- Social networking and social media
- Blogging and commenting on posts
- Podcasting
- Tagging and social bookmarking
- Sharing photos and videos
- Voting on web content
Web 2.0’s problem was not so much the content as the framework. Its centralised architecture opens the door to security breaches, data harvesting for malicious purposes, privacy invasion, and added cost. Furthermore, centralised platforms took over data storage, creating access challenges as well as concerns about the anonymity and security of online data.
Key breakthroughs such as mobile internet access and social networks, along with the near-ubiquity of powerful mobile devices like iPhones and Android-powered smartphones, have fueled Web 2.0’s exponential expansion. These advances enabled the dominance of apps that greatly extended online engagement and usefulness in the second decade of this millennium: Airbnb, Facebook, Instagram, TikTok, Twitter, Uber, WhatsApp, and YouTube, to name a few.
Image: https://www.xrtoday.com/virtual-reality/web-2-vs-web-3/
Many Web 2.0-centric firms, including Apple, Amazon, Google, Meta (formerly Facebook), and Netflix, have become some of the world’s largest companies by market capitalisation as a result of their remarkable revenue growth.
Our interactions with the internet generate valuable data about our online activity. Companies utilise this data to build new platforms and serve targeted advertisements, and they also profit from it by selling it to third parties. Critics argue that this leaves consumers with little to no control over where and how their data is used.
Web 3.0 differs from Web 2.0 in that it focuses more on using technologies such as machine learning and AI to deliver relevant content to each user, rather than only the content that other end users have contributed. Web 2.0 enables people to contribute to and participate in on-site content, but Web 3.0 will most likely delegate these jobs to semantic web and AI technologies. Web 3.0 also places a heavy emphasis on decentralised services and authority, in contrast to Web 2.0’s centralisation.
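To illustrate what a “decentralised service” can look like in practice, here is a small, hedged TypeScript sketch that reads the latest block number directly from a blockchain node over Ethereum’s public JSON-RPC interface, rather than going through a company-operated API. The node URL is a placeholder you would replace with a node you run or trust; `eth_blockNumber` is a standard Ethereum JSON-RPC method.

```typescript
// Sketch of a decentralised read: query a blockchain node directly via
// Ethereum's JSON-RPC interface instead of a centrally owned web API.
// NODE_URL is a placeholder; point it at a node you run or trust.
const NODE_URL = "https://your-ethereum-node.example";

async function latestBlockNumber(): Promise<number> {
  const response = await fetch(NODE_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      jsonrpc: "2.0",
      method: "eth_blockNumber", // standard Ethereum JSON-RPC method
      params: [],
      id: 1,
    }),
  });
  const { result } = (await response.json()) as { result: string };
  return parseInt(result, 16); // the node returns a hex-encoded number
}

latestBlockNumber()
  .then((n) => console.log(`Latest block: ${n}`))
  .catch(console.error);
```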
Berners-Lee discussed the notion of the Semantic Web in a paper published in 2001. On the ordinary web, there is no reliable way for computers to process the semantics of language (i.e., to figure out the actual context in which a word or phrase is used). Berners-Lee’s ambition for the Semantic Web was to add structure to the meaningful content of web pages, so that software could carry out complex tasks for people.
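One concrete descendant of that idea is the structured-data markup many sites embed today. The TypeScript sketch below builds a schema.org description of an article as JSON-LD, the kind of machine-readable annotation that lets software understand what a page is about rather than just render it; the article details are made-up examples.

```typescript
// Sketch of semantic, machine-readable page data using schema.org vocabulary
// expressed as JSON-LD. The article details below are made-up examples.
const articleMetadata = {
  "@context": "https://schema.org",
  "@type": "Article",
  headline: "An Introduction to Web 3.0",
  author: { "@type": "Person", name: "Jane Doe" },
  datePublished: "2022-05-01",
  about: ["Web 3.0", "Semantic Web", "Decentralisation"],
};

// Embedding this in a page lets crawlers and assistants interpret the
// content instead of only displaying it.
const script = document.createElement("script");
script.type = "application/ld+json";
script.textContent = JSON.stringify(articleMetadata);
document.head.appendChild(script);
```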
Image: https://www.dshgsonic.com/blog/web-3-0-the-start-of-a-new-era
Web 3.0 has progressed well beyond Berners-Lee’s initial notion of the Semantic Web, which he first proposed in 2001. This is partly because of the cost and difficulty of converting human language, with all of its subtleties and variations, into a format that computers can understand, and partly because Web 2.0 has already changed significantly over the last two decades.
Here, we looked at the three generations of the web. Web 1.0 was primarily about reading and accessing information. Web 2.0 was about reading and writing data, as well as producing content. Web 3.0 is about reading, publishing, and owning data.
The media shown in this article is not owned by Analytics Vidhya and is used at the Author’s discretion.