
The World Wide Web (abbreviated WWW or the Web) is an information space where documents and other web resources are identified by Uniform Resource Locators (URLs), interlinked by hypertext links, and accessible via the Internet. The British scientist Tim Berners-Lee invented the World Wide Web in 1989. He wrote the first web browser in 1990 while employed at CERN in Switzerland. The browser was released outside CERN in 1991, first to other research institutions starting in January 1991 and then to the general public on the Internet in August 1991.

The World Wide Web has been central to the development of the Information Age and is the primary tool billions of people use to interact on the Internet. Web pages are primarily text documents formatted and annotated with Hypertext Markup Language (HTML). In addition to formatted text, web pages may contain images, video, audio, and software components that are rendered in the user's web browser as coherent pages of multimedia content.

Embedded hyperlinks permit users to navigate between web pages. Multiple web pages with a common theme, a common domain name, or both, make up a website. Website content can largely be provided by the publisher, or interactively, where users contribute content or the content depends upon the users or their actions. Websites may be mostly informative, primarily for entertainment, or largely for commercial, governmental, or non-governmental organizational purposes.





History

Tim Berners-Lee's vision of a global hyperlinked information system became a possibility in the second half of the 1980s. By 1985, the global Internet had begun to proliferate in Europe and the Domain Name System (upon which the Uniform Resource Locator is built) had come into being. In 1988 the first direct IP connection between Europe and North America was made, and Berners-Lee began to openly discuss the possibility of a web-like system at CERN. In March 1989 Berners-Lee issued a proposal to the management at CERN for a system called "Mesh" that referenced ENQUIRE, a database and software project he had built in 1980, which used the term "web" and described a more elaborate information management system based on links embedded in readable text: "Imagine, then, the references in this document all being associated with the network address of the thing to which they referred, so that while reading this document you could skip to them with a click of the mouse." Such a system, he explained, could be referred to using one of the existing meanings of the word hypertext, a term that he says was coined in the 1950s. There is no reason, the proposal continues, why such hypertext links could not encompass multimedia documents including graphics, speech and video, so that Berners-Lee goes on to use the term hypermedia.

With help from his colleague and fellow hypertext enthusiast Robert Cailliau, he published a more formal proposal on November 12, 1990 to build a "Hypertext project" called "WorldWideWeb" (one word) as a "web" of "hypertext documents" to be viewed by "browsers" using a client-server architecture. At this point HTML and HTTP had already been in development for about two months, and the first web server was about a month from completing its first successful test. The proposal estimated that a read-only web would be developed within three months, and that it would take six months to achieve "the creation of new links and new material by readers, [so that] authorship becomes universal" as well as "the automatic notification of a reader when new material of interest to him/her has become available". While the read-only goal was met, accessible authorship of web content took longer to mature, with the wiki concept, WebDAV, blogs, Web 2.0 and RSS/Atom.

The proposal was modeled on the SGML reader Dynatext by Electronic Book Technology, a spin-off from the Institute for Research in Information and Scholarship at Brown University. The Dynatext system, licensed by CERN, was a key player in the extension of SGML ISO 8879:1986 to Hypermedia within HyTime, but it was considered too expensive and had an inappropriate licensing policy for use in the general high-energy physics community, namely a fee for each document and each document alteration. A NeXT Computer was used by Berners-Lee as the world's first web server and also to write the first web browser, WorldWideWeb, in 1990. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the first web browser (which was also a web editor) and the first web server. The first website, which described the project itself, was published on December 20, 1990.

The first web page may be lost, but Paul Jones of UNC-Chapel Hill in North Carolina announced in May 2013 that Berners-Lee had given him what he says is the oldest known web page during a 1991 visit to UNC. Jones stored it on a magneto-optical drive and on his NeXT computer. On August 6, 1991, Berners-Lee published a short summary of the World Wide Web project on the newsgroup alt.hypertext. This date is sometimes confused with the public availability of the first web servers, which had occurred months earlier. As another example of such confusion, several news media reported that the first photo on the Web was published by Berners-Lee in 1992, an image of the CERN house band Les Horribles Cernettes taken by Silvano de Gennaro; Gennaro has disclaimed this story, writing that the media were "totally distorting our words for the sake of cheap sensationalism".

The first server outside Europe was installed at the Stanford Linear Accelerator Center (SLAC) in Palo Alto, California, to host the SPIRES-HEP database. Accounts differ substantially as to the date of this event. The World Wide Web Consortium's timeline says December 1992, whereas SLAC itself claims December 1991, as does a W3C document titled A Little History of the World Wide Web. The underlying concept of hypertext originated in earlier projects from the 1960s, such as the Hypertext Editing System (HES) at Brown University, Ted Nelson's Project Xanadu, and Douglas Engelbart's oN-Line System (NLS). Both Nelson and Engelbart were in turn inspired by Vannevar Bush's microfilm-based memex, which was described in the 1945 essay "As We May Think".

Berners-Lee's breakthrough was to marry hypertext to the Internet. In his book Weaving The Web, he explains that he had repeatedly suggested to members of both technical communities that a marriage between the two technologies was possible, but when no one took up his invitation, he finally took on the project himself. In the process, he developed three essential technologies:

  • a system of globally unique identifiers for resources on the Web and elsewhere, the universal document identifier (UDI), later known as the Uniform Resource Locator (URL) and Uniform Resource Identifier (URI);
  • the publishing language HyperText Markup Language (HTML);
  • Hypertext Transfer Protocol (HTTP).

The World Wide Web had a number of differences from other hypertext systems available at the time. The Web required only unidirectional links rather than bidirectional ones, making it possible for someone to link to another resource without action by the owner of that resource. It also significantly reduced the difficulty of implementing web servers and browsers (in comparison to earlier systems), but in turn presented the chronic problem of link rot. Unlike predecessors such as HyperCard, the World Wide Web was non-proprietary, making it possible to develop servers and clients independently and to add extensions without licensing restrictions. On April 30, 1993, CERN announced that the World Wide Web would be free to anyone, with no fees due. Coming two months after the announcement that the server implementation of the Gopher protocol was no longer free to use, this produced a rapid shift away from Gopher and towards the Web. An early popular web browser was ViolaWWW for Unix and the X Window System.

Scholars generally agree that a turning point for the World Wide Web began with the introduction of the Mosaic web browser in 1993, a graphical browser developed by a team at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (NCSA-UIUC), led by Marc Andreessen. Funding for Mosaic came from the US High-Performance Computing and Communications Initiative and the High Performance Computing Act of 1991, one of several computing developments initiated by US Senator Al Gore. Prior to the release of Mosaic, graphics were not commonly mixed with text on web pages, and the Web's popularity was less than that of older protocols in use over the Internet, such as Gopher and Wide Area Information Servers (WAIS). Mosaic's graphical user interface allowed the Web to become, by far, the most popular Internet protocol. The World Wide Web Consortium (W3C) was founded by Tim Berners-Lee after he left the European Organization for Nuclear Research (CERN) in October 1994. It was founded at the Massachusetts Institute of Technology Laboratory for Computer Science (MIT/LCS) with support from the Defense Advanced Research Projects Agency (DARPA), which had pioneered the Internet; a year later, a second site was founded at INRIA (a French national computer research laboratory) with support from the European Commission DG InfSo; and in 1996, a third continental site was created in Japan at Keio University. By the end of 1994, the total number of websites was still relatively small, but many notable websites were already active that foreshadowed or inspired today's most popular services.

Connected by the Internet, other websites were created around the world. This motivated the development of international standards for protocols and formatting. Berners-Lee continued to stay involved in guiding the development of web standards, such as the markup languages in which web pages are composed, and he advocated his vision of a Semantic Web. The World Wide Web enabled the spread of information over the Internet through an easy-to-use and flexible format. It thus played an important role in popularizing use of the Internet. Although the two terms are sometimes conflated in popular use, the World Wide Web is not synonymous with the Internet. The Web is an information space containing hyperlinked documents and other resources, identified by their URIs. It is implemented as both client and server software using Internet protocols such as TCP/IP and HTTP. Berners-Lee was knighted in 2004 by Queen Elizabeth II for "services to the global development of the Internet".




Function

The terms Internet and World Wide Web are often used without much distinction. However, the two are not the same. The Internet is a global system of interconnected computer networks. In contrast, the World Wide Web is a global collection of documents and other resources, linked by hyperlinks and URIs. Web resources are usually accessed using HTTP, which is one of many Internet communication protocols.

Viewing a web page on the World Wide Web normally begins either by typing the URL of the page into a web browser, or by following a hyperlink to that page or resource. The web browser then initiates a series of background communication messages to fetch and display the requested page. In the 1990s, using a browser to view web pages, and to move from one web page to another through hyperlinks, came to be known as 'browsing', 'web surfing' (after channel surfing), or 'navigating the Web'. Early studies of this new behavior investigated user patterns in using web browsers. One study, for example, found five user patterns: exploratory surfing, window surfing, evolved surfing, limited navigation, and targeted navigation.

The following example demonstrates the functioning of a web browser when accessing a page at the URL http://www.example.org/home.html. The browser resolves the server name of the URL (www.example.org) into an Internet Protocol address using the globally distributed Domain Name System (DNS). This lookup returns an IP address such as 203.0.113.4 or 2001:db8:2e::7334. The browser then requests the resource by sending an HTTP request across the Internet to the computer at that address. It requests service from a specific TCP port number that is well known for the HTTP service, so that the receiving host can distinguish an HTTP request from other network protocols it may be servicing. HTTP normally uses port number 80. The content of the HTTP request can be as simple as two lines of text:
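    GET /home.html HTTP/1.1
    Host: www.example.org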

The computer receiving the HTTP request delivers it to web server software listening for requests on port 80. If the web server can fulfill the request, it sends an HTTP response back to the browser indicating success:
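    HTTP/1.1 200 OK
    Content-Type: text/html; charset=UTF-8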

followed by the content of the requested page. The HyperText Markup Language (HTML) for a basic web page might look like this:
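    <html>
      <head>
        <title>Example.org - The World Wide Web</title>
      </head>
      <body>
        <p>The World Wide Web is an information space of interlinked hypertext documents.</p>
      </body>
    </html>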

The web browser parses the HTML and interprets the markup (<title>, <p> for paragraph, and suchlike) that surrounds the words in order to format the text on the screen. Many web pages use HTML to reference the URLs of other resources such as images, other embedded media, scripts that affect page behavior, and Cascading Style Sheets that affect page layout. The browser makes additional HTTP requests to the web server for these other Internet media types. As it receives their content from the web server, the browser progressively renders the page onto the screen as specified by its HTML and these additional resources.
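
Such a page might, for example, also reference a stylesheet, an image and a script (the file names below are purely illustrative); each such reference causes the browser to issue a further HTTP request:

    <link rel="stylesheet" href="/style.css">
    <img src="/logo.png" alt="Site logo">
    <script src="/app.js"></script>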

Linking

Most web pages contain hyperlinks to other related pages and perhaps to downloadable files, source documents, definitions, and other web resources. In the underlying HTML, a hyperlink looks like this: <a href="http://www.example.org/home.html">Example.org Homepage</a>

A collection of useful, related resources, interlinked via hypertext links, is dubbed a web of information. Publication on the Internet created what Tim Berners-Lee first called the WorldWideWeb (in its original CamelCase, which was subsequently discarded) in November 1990.

The hyperlink structure of the WWW is described by the webgraph: the nodes of the web graph correspond to the web pages (or URLs), and the directed edges between them correspond to hyperlinks. Over time, many web resources pointed to by hyperlinks disappear, relocate, or are replaced with different content. This makes hyperlinks obsolete, a phenomenon referred to in some circles as link rot, and the hyperlinks affected by it are often called dead links. The ephemeral nature of the Web has prompted many efforts to archive websites. The Internet Archive, active since 1996, is the best known of such efforts.

Dynamic updates of web pages

JavaScript is a scripting language that was initially developed in 1995 by Brendan Eich, then of Netscape, for use within web pages. The standardized version is ECMAScript. To make web pages more interactive, some web applications also use JavaScript techniques such as Ajax (asynchronous JavaScript and XML). Client-side script is delivered with the page and can make additional HTTP requests to the server, either in response to user actions such as mouse movements or clicks, or based on elapsed time. The server's responses are used to modify the current page rather than creating a new page with each response, so the server only needs to provide limited, incremental information. Multiple Ajax requests can be handled at the same time, and users can interact with the page while data is retrieved. Web pages may also regularly poll the server to check whether new information is available.
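
For example, a script embedded in a page might poll the server periodically and replace part of the page in place; in the illustrative sketch below, the /latest-news endpoint and the news element are hypothetical:

    <script>
      // Poll the server every 30 seconds and update part of the current page in place.
      async function refreshNews() {
        const response = await fetch('/latest-news');          // an additional HTTP request
        const fragment = await response.text();                // the server returns only a small fragment
        document.getElementById('news').innerHTML = fragment;  // modify the page without reloading it
      }
      setInterval(refreshNews, 30000);
    </script>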

WWW prefix

Many hostnames used for the World Wide Web begin with www because of the long-standing practice of naming Internet hosts according to the services they provide. The hostname of a web server is often www, in the same way that it may be ftp for an FTP server, and news or nntp for a USENET news server. These hostnames appear as Domain Name System (DNS) or subdomain names, as in www.example.com. The use of www is not required by any technical or policy standard and many websites do not use it; the first web server was nxoc01.cern.ch. According to Paolo Palazzi, who worked at CERN alongside Tim Berners-Lee, the popular use of www as a subdomain was accidental; the World Wide Web project page was intended to be published at www.cern.ch while info.cern.ch was intended to be the CERN home page, but the DNS records were never switched, and the practice of prepending www to an institution's website domain name was subsequently copied. Many established websites still use the prefix, or they employ other subdomain names such as www2, secure or en for special purposes. Many such web servers are set up so that both the main domain name (for example, example.com) and the www subdomain (for example, www.example.com) refer to the same site; others require one form or the other, or they may map to different websites. The use of a subdomain name is useful for load balancing incoming web traffic by creating a CNAME record that points to a cluster of web servers. Since, currently, only a subdomain can be used in a CNAME, the same result cannot be achieved by using the bare domain root.
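
For illustration, such an arrangement might be expressed in DNS zone records roughly as follows (the host names and addresses are fictitious, drawn from documentation address space):

    ; www is a CNAME pointing at a name that resolves to a pool of web servers
    www.example.com.      IN  CNAME  webfarm.example.com.
    webfarm.example.com.  IN  A      203.0.113.10
    webfarm.example.com.  IN  A      203.0.113.11
    ; the bare domain root cannot be a CNAME, so it needs its own address record
    example.com.          IN  A      203.0.113.10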

When a user submits an incomplete domain name to a web browser in its address bar input field, some web browsers automatically try adding the prefix "www" to the beginning of it and possibly ".com", ".org" and ".net" at the end, depending on what might be missing. For example, entering 'microsoft' may be transformed to http://www.microsoft.com/ and 'openoffice' to http://www.openoffice.org. This feature started appearing in early versions of Firefox, when it still had the working title 'Firebird' in early 2003, following an earlier practice in browsers such as Lynx. It is reported that Microsoft was granted a US patent for the same idea in 2008, but only for mobile devices.

In English, www is usually read as double-u double-u double-u. Some users pronounce it dub-dub-dub, particularly in New Zealand. Stephen Fry, in his "Podgrams" series of podcasts, pronounces it wuh wuh wuh. The British writer Douglas Adams once quipped in The Independent on Sunday (1999): "The World Wide Web is the only thing I know of whose shortened form takes three times longer to say than what it's short for." In Mandarin Chinese, World Wide Web is commonly translated via a phono-semantic matching to wàn wéi wǎng (万维网), which satisfies www and literally means "myriad-dimensional net", a translation that reflects the design concept and proliferation of the World Wide Web. Tim Berners-Lee's web-space states that World Wide Web is officially spelled as three separate words, each capitalized, with no intervening hyphens. Use of the www prefix has been declining, especially as Web 2.0 web applications sought to brand their domain names and make them easily pronounceable. As the mobile Web grew in popularity, services like Gmail.com, Outlook.com, Myspace.com, Facebook.com and Twitter.com are most often mentioned without adding "www." (or, indeed, ".com") to the domain.

Scheme specifiers

The scheme specifiers http:// and https:// at the start of a web URI refer to the Hypertext Transfer Protocol or HTTP Secure, respectively. They specify the communication protocol to use for the request and the response. The HTTP protocol is fundamental to the operation of the World Wide Web, and the added encryption layer in HTTPS is essential when browsers send or retrieve confidential data, such as passwords or banking information. Web browsers usually automatically prepend http:// to user-entered URIs, if omitted.



Web security

For criminals, the Web has become a venue to spread malware and engage in a range of cybercrimes, including identity theft, fraud, espionage and intelligence gathering. Web-based vulnerabilities now outnumber traditional computer security concerns, and as measured by Google, about one in ten web pages may contain malicious code. Most web-based attacks take place on legitimate websites, and most, as measured by Sophos, are hosted in the United States, China and Russia. The most common of all malware threats is SQL injection attacks against websites. Through HTML and URIs, the Web is vulnerable to attacks such as cross-site scripting (XSS) that came with the introduction of JavaScript and were exacerbated to some degree by Web 2.0 and Ajax web design that favors the use of scripts. Today, by one estimate, 70% of all websites are open to XSS attacks on their users. Phishing is another common threat to the Web; in early 2013, RSA (the security division of EMC) estimated the global losses from phishing at $1.5 billion in 2012. Two of the well-known phishing methods are Covert Redirect and Open Redirect.

Proposed solutions vary. Large security companies like McAfee have designed governance and compliance suites to meet post-9/11 regulations, and some, such as Finjan, have recommended active real-time inspection of program code and all content regardless of its source. Some have argued that enterprises should see Web security as a business opportunity rather than a cost center, while others call for "ubiquitous, always-on digital rights management" enforced in the infrastructure to replace the hundreds of companies that secure data and networks. Jonathan Zittrain has said that users sharing responsibility for computing safety is far preferable to locking down the Internet.



Privacy

Every time a client requests a web page, the server can identify the request's IP address and usually logs it. Also, unless set not to do so, most web browsers record requested web pages in a viewable history feature, and usually cache much of the content locally. Unless server-browser communication uses HTTPS encryption, web requests and responses travel in plain text across the Internet and can be viewed, recorded, and cached by intermediate systems. When a web page asks for, and the user supplies, personally identifiable information, such as their real name, address, e-mail address, etc., web-based entities can associate current web traffic with that individual. If the website uses HTTP cookies, username and password authentication, or other tracking techniques, it can relate other web visits, before and after, to the identifiable information provided. In this way it is possible for a web-based organization to develop and build a profile of the individual people who use its site or sites. It may be able to build a record for an individual that includes information about their leisure activities, their shopping interests, their profession, and other aspects of their demographic profile. These profiles are obviously of potential interest to marketers, advertisers and others. Depending on the website's terms and conditions and the local laws that apply, information from these profiles may be sold, shared, or passed to other organizations without the user being informed. For many ordinary people, this means little more than some unexpected e-mails in their in-box or some targeted advertising on a future web page. For others, it can mean that time spent indulging an unusual interest can result in a deluge of further targeted marketing that may be unwelcome. Law enforcement, counter-terrorism, and espionage agencies can also identify, target and track individuals based on their interests or proclivities on the Web.

Social networking sites try to get users to use their real names, interests, and locations, rather than pseudonyms. These websites' leaders believe this makes the social networking experience more engaging for users. On the other hand, uploaded photos or unguarded statements can be identified to an individual, who may regret this exposure. Employers, schools, parents, and other relatives may be influenced by aspects of social networking profiles, such as text posts or digital photos, that the posting individual did not intend for these audiences. On-line bullies may make use of personal information to harass or stalk users. Modern social networking websites allow fine-grained control of the privacy settings for each individual posting, but these can be complex and not easy to find or use, especially for beginners. Photographs and videos posted onto websites have caused particular problems, as they can add a person's face to an on-line profile. With modern and potential facial recognition technology, it may then be possible to relate that face to other, previously anonymous, images, events and scenarios that have been imaged elsewhere. Due to image caching, mirroring and copying, it is difficult to remove an image from the World Wide Web.



Standards

Many formal standards and other technical and software specifications determine the operation of various aspects of the World Wide Web, the Internet, and the exchange of computer information. Many documents are the work of the World Wide Web Consortium (W3C), headed by Berners-Lee, but some are produced by the Internet Engineering Task Force (IETF) and other organizations.

Usually, when web standards are discussed, the following publications are seen as the basis:

  • Recommendations for markup languages, especially HTML and XHTML, from the W3C. These define the structure and interpretation of hypertext documents.
  • Recommendations for stylesheets, especially CSS, from W3C.
  • Standard for ECMAScript (usually in JavaScript form), from Ecma International.
  • Recommendations for the Document Object Model, from W3C.

Additional publications provide other important technological definitions for the World Wide Web, including, but not limited to, the following:

  • Uniform Resource Identifier (URI), which is a universal system for referencing resources on the Internet, such as hypertext documents and images. URIs, often called URLs, are defined by the IETF's RFC 3986 / STD 66: Uniform Resource Identifier (URI): Generic Syntax, as well as its predecessors and the numerous RFCs defining URI schemes;
  • HyperText Transfer Protocol (HTTP), especially as defined by RFC 2616: HTTP/1.1 and RFC 2617: HTTP Authentication, which specify how the browser and server authenticate each other.



Accessibility

There are methods for accessing the Web in alternative media and formats to facilitate use by individuals with disabilities. These disabilities may be visual, auditory, physical, speech-related, cognitive, neurological, or some combination of these. Accessibility features also help people with temporary disabilities, like a broken arm, and aging users as their abilities change. The Web is used for receiving information as well as for providing information and interacting with society. The World Wide Web Consortium claims that it is essential that the Web be accessible, so that it can provide equal access and equal opportunity to people with disabilities. Tim Berners-Lee once noted, "The power of the Web is in its universality. Access by everyone regardless of disability is an essential aspect." Many countries regulate web accessibility as a requirement for websites. International cooperation in the W3C Web Accessibility Initiative led to simple guidelines that web content authors as well as software developers can use to make the Web accessible to persons who may or may not be using assistive technology.



Internationalization

The W3C Internationalization Activity assures that web technology works in all languages, scripts, and cultures. Beginning in 2004 or 2005, Unicode gained ground and eventually, in December 2007, surpassed both ASCII and Western European encodings as the most frequently used character encoding on the Web. Originally RFC 3986 allowed resources to be identified by a URI in a subset of US-ASCII. RFC 3987 allows more characters (any character in the Universal Character Set), and now a resource can be identified by an IRI in any language.
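
For example (the path shown is purely illustrative), an IRI containing non-ASCII characters corresponds to a percent-encoded URI when it must be expressed in ASCII-only form:

    IRI:  http://example.org/über
    URI:  http://example.org/%C3%BCber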



Statistics

Between 2005 and 2010, the number of web users doubled, and was expected to surpass two billion in 2010. Early studies in 1998 and 1999 estimating the size of the Web using capture/recapture methods showed that much of the Web was not indexed by search engines and that the Web was much larger than expected. According to a 2001 study, there was a massive number, over 550 billion, of documents on the Web, mostly in the invisible Web, or Deep Web. A 2002 survey of 2,024 million web pages determined that by far the most web content was in English: 56.4%; next were pages in German (7.7%), French (5.6%), and Japanese (4.9%). A more recent study, which used web searches in 75 different languages to sample the Web, determined that there were over 11.5 billion web pages in the publicly indexable web as of the end of January 2005. As of March 2009, the indexable web contained at least 25.21 billion pages. On July 25, 2008, Google software engineers Jesse Alpert and Nissan Hajaj announced that Google Search had discovered one trillion unique URLs. As of May 2009, over 109.5 million domains were in operation. Of these, 74% were commercial or other domains operating in the generic top-level domain com. Statistics measuring a website's popularity, such as the Alexa Internet rankings, are usually based either on the number of page views or on associated server "hits" (file requests) that it receives.



Web caching

Source of the article: Wikipedia
