AJAX and the Maturation of Web Development

From "View Source" to "Open Source"

"I have always imagined the information space as something to which everyone has immediate and intuitive access, and not just to browse but to create" (Weaving the Web by Tim Berners-Lee).

From the beginning, the World Wide Web that Tim Berners-Lee imagined was a place where the architecture of participation ruled. Berners-Lee's first application for accessing the information Web was both a browser and an editor, and throughout the early 1990s he worked diligently to encourage Web browser development groups to develop editors and servers as well as browsers. As early as the spring of 1992, the challenge was clear: "Although browsers were starting to spread, no one working on them tried to include writing and editing functions....As soon as developers got their client working as a browser and released it to the world, very few bothered to continue to develop it as an editor" (Weaving the Web by Tim Berners-Lee).

Developers tended to defer editing functionality for a number of reasons, mostly to compress development schedules and get the browser out the door: they reasoned that many people would use a browser even without an editor, which was more complex and useful to a smaller audience. Netscape Communicator 4.0, released in 1997, did finally include Netscape Composer, although its licensing terms only allowed free use for non-commercial purposes. Internet Explorer never contained an editor directly, though Microsoft acquired FrontPage from Vermeer in 1996; FrontPage 1.0 had been released in 1995 (www.seoconsultants.com/frontpage/history/).

It wasn't just the complexity of the editing functions themselves, of course, but also the fact that reading pages required a much simpler authorization model, in which the user either has access to the document or does not. In fact, the cluster of issues at hand - from version control of Web pages to multiple authors editing the same page, sometimes at the same time, to control over who should have access to change what pages - would busy the content management industry for the better part of the next decade.

While the early Web browser teams deferred creation of an HTML editor, they retained a key element of Berners-Lee's original Web browser/editor:

The 'View Source' menu item migrated from Tim Berners-Lee's original browser, to Mosaic, and then on to Netscape Navigator and even Microsoft's Internet Explorer. Though no one thinks of HTML as an open source technology, its openness was absolutely key to the explosive spread of the Web. Barriers to entry for "amateurs" were low, because anyone could look "over the shoulder" of anyone else producing a Web page ("The Architecture of Participation" by Tim O'Reilly).

This "View Source" menu item, which was not buried in developer editions or professional versions but was part of the core browser, created a culture of easy access to knowledge.

As the complexity of presentation-tier Web development grew, with Cascading Style Sheets, JavaScript, and DHTML in the mix, the View Source culture of Web development evolved into an open source culture of frameworks and libraries. It is this culture that enables the viability of current AJAX-based approaches to Web development.
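The core technique behind those AJAX-based approaches can be sketched in a few lines: request a resource asynchronously via XMLHttpRequest and hand the response to a callback, rather than reloading the whole page. This is an illustrative sketch, not code from the article; the function name, URL, and element id are hypothetical.

```javascript
// Minimal sketch of the classic AJAX pattern (hypothetical helper):
// fetch a resource asynchronously and pass its text to a callback.
function fetchFragment(url, onSuccess, onError) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", url, true); // third argument: asynchronous request
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4) {        // 4 = request complete
      if (xhr.status === 200) {
        onSuccess(xhr.responseText);   // hand the raw response to the caller
      } else if (onError) {
        onError(xhr.status);
      }
    }
  };
  xhr.send(null);
}

// In a browser one might then update part of a page in place
// (the path and element id here are placeholders):
// fetchFragment("/news/latest.html", function (html) {
//   document.getElementById("news").innerHTML = html;
// });
```

Because both the page markup and scripts like this were served as plain text, the same View Source habit that taught a generation to write HTML also let developers study exactly how early AJAX applications worked.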

The View Source Culture
From a narrow perspective, the decision to include a View Source option in the Web browser was an insignificant choice, perhaps useful for troubleshooting formatting issues, but of interest to a very small community. As Berners-Lee puts it, "I never intended HTML source code (the stuff with the angle brackets) to be seen by users. A browser/editor would let a user simply view or edit the language of a page of hypertext, as if he were using a word processor. The idea of asking people to write the angle brackets by hand was to me, and I assume to many, as unacceptable as asking one to prepare a Microsoft Word document by writing out its binary coded format" (Weaving the Web by Tim Berners-Lee). What View Source did (and still does!) was let users who were interested in learning to create Web pages see what HTML source was delivered to the browser to produce the page currently being rendered. Perhaps because many of the early Web users were developers of one kind or another, it became an expectation that any reasonable browser would include the ability to View Source.
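To illustrate the point (this is a hypothetical page, not one cited in the article), the kind of source an early View Source window revealed was entirely readable, angle brackets and all:

```html
<!-- A hypothetical early-1990s-style page: everything a curious
     user saw via View Source was plain, human-readable markup. -->
<html>
<head>
<title>My Home Page</title>
</head>
<body>
<h1>Welcome</h1>
<p>This is a paragraph with a <a href="other.html">link</a>.</p>
</body>
</html>
```

Unlike a word processor's binary format, nothing here is opaque: a newcomer could copy it, change the text, and have a working page of their own.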

Viewed more broadly, however, the View Source command was nothing short of revolutionary. It set the expectation that users should be able to not only view the "rendered" document, but also the "code" that created it. Because early browsers often differed in their interpretation of HTML, this was critical. Significantly, though, the View Source option was not buried in a developer's edition but was part of the edition everyone used, which encouraged even neophyte users to view the source of pages, whereupon they would see the relative simplicity of (especially early) HTML. (An interesting discussion about the need for the View Source option can be found in this bug report: https://bugzilla.mozilla.org/show_bug.cgi?id=256213 - which was a request to move View Source into a developer build of Firefox, and was ultimately rejected.)

This same expectation - that users should be able to view the raw source of files served by Web servers in addition to the rendered effect - was later extended to Cascading Style Sheets (.css files) and JavaScript (.js). In order to be rendered and displayed, browsers would need to download HTML, CSS, and JavaScript files, along with images and other binary files referenced in pages. However, the fact that the major browser developers chose to expose access to raw source as a first-level menu item was extraordinary. (In some cases accessing .css and .js files required a bit more ingenuity on the user's part, but nothing like the difficulty of accessing the source files in any other format such as Microsoft's Word or Adobe's PDF.)
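Concretely, a page declares its dependencies on external style and script files in its markup, so the browser (and therefore any curious user) can retrieve each file's raw source. The filenames below are hypothetical placeholders:

```html
<!-- Hypothetical page head: the browser must download styles.css and
     behavior.js to render the page, so their raw source is equally
     visible - via View Source, or by requesting the URLs directly. -->
<head>
  <link rel="stylesheet" type="text/css" href="styles.css">
  <script type="text/javascript" src="behavior.js"></script>
</head>
```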


More Stories By John Eckman

John Eckman is Senior Director of Optaros Labs and has over a decade of experience designing and building web applications for organizations ranging from small non-profits to Fortune 500 enterprises.

John blogs at OpenParenthesis and you can find him on Twitter, Facebook, and many other social networks: all his online personas come together at JohnEckman.com.

His expertise includes user experience design, presentation-tier development, and software engineering. Prior to Optaros, he was Director of Development at PixelMEDIA, a web design and development firm in Portsmouth, NH, focused on e-commerce, content management, and intranet applications. As Director of Development, he was responsible for managing the application development, creative services, project management, web development, and maintenance teams, as well as providing strategic leadership to teams on key client accounts, including Teradyne, I-Logix, and LogicaCMG.

Previously, John was a Principal Consultant with Molecular, a Watertown MA-based professional services and technology consulting firm. In this role he was responsible for leading technical and user experience teams for clients including JPMorgan Invest, Brown|Co, Knights of Columbus Insurance, and BlueCross and BlueShield of Massachusetts. Before becoming a Principal Consultant, he served in a number of other roles at Tvisions / Molecular, including various project lead roles as well as User Experience Manager and Director of Production.

John's technical background includes J2EE and .NET frameworks as well as scripting languages and presentation-tier development approaches, in addition to information architecture, usability testing, and project management. He received a BA from Boston University and a PhD from the University of Washington, Seattle; he completed an MS in Information Systems from Northeastern University in 2007. He lives with his wife and two cavalier spaniels in Newburyport, MA.
