My Contents

Monday, November 26, 2007

Oracle Communications Introduces Unified Inventory Management

Oracle Communications recently introduced Oracle Communications Unified Inventory Management (UIM), a new standards-based inventory management application that provides communications service providers, carriers and network operators with a real-time, unified view of customer, service and resource inventory. The new application will enable service providers to rapidly introduce and deliver customer-centric, next-generation services; improve the financial control and compliance related to their infrastructures; and deploy and manage network resources in an optimized, cost-effective manner.

David Sharpley, Vice President, Product Marketing and Channels, Oracle, says, “Our new Unified Inventory Management [UIM] product has been on our development drawing board for well over two years, and I’d say closer to three years. This inventory system and how we’re approaching the market is different on a couple of fronts. First, it was architected from the ground up using an open standards-based architecture. So it is the first inventory product to be architected using what’s called the Shared Information/Data [SID] model from the TeleManagement Forum [www.tmforum.com]. That Shared Information/Data model is a common representation of key objects and their attributes, so things such as customers, services and so forth can then be leveraged across the organization as well as be integrated. SID is standards-based.”

“Also, consistent with Oracle’s strategy in the development of our applications suite,” says Sharpley, “it is very open. And when we say open, what we mean is this whole thing is exposed and is architected from the ground up using a Service-Oriented Architecture (SOA), which can then enable the use of that data through a variety of different systems. That’s just the first thing as to how the product was architected.”

“Secondly, there’s the approach in terms of the data management,” says Sharpley, “which is also unique. Many of our competitors started out by saying they had to migrate all of the data into their system, and then they can leverage that into the inventory management systems. The Oracle approach is different in that we want you, the service provider, to be able to leverage the data that may exist in a variety of different systems in your network today, yet use the unified inventory management and ‘unify’ that data as well as aggregate and enhance it, to provide a single view of the customer, or a single view of the service, and allow for things such as rapid service bundling. So, instead of recreating and/or migrating data from one legacy system into the new platform, it’s really about wrapping and copying the existing data that sits there and exposing it in a contemporary, architecturally-friendly way.”

“You can look at it as a kind of funnel,” says Sharpley. “A lot of the other approaches are something like this: in order to leverage that data, you need to migrate data from your legacy system into the new system. And we ask, ‘Why would you want to migrate data on broadband, for example? The broadband system is working fine and it’s not going anywhere. What’s the business case of taking everyone’s broadband network subscriber and service data and putting that into a new system?’ It’s a major cost. The business case is likely not there. What you need to do is take that broadband information, allow it to be exposed through our UIM, which then leverages it as you go and build quad-play bundles and so forth that utilize that broadband information.”

“The third thing which makes our Unified Inventory Management system unique,” says Sharpley, “is that we approach it not just from a network, physical perspective, but also from a logical view, taking in things that are increasingly important, such as IP addresses, SIM cards, telephone numbers and so forth. Not just the physical equipment, but the logical as well as the customer and the service data. And the reason why that’s important is that it’s really about how you manage the customer and the services. From an Oracle perspective, what makes us unique is that we can then leverage that information with other systems within Oracle. For example, integration with Customer Relationship Management [CRM] provides a detailed view of the customer and of what services are required to substantiate that customer’s service through integration with UIM.”

“Very similarly, we can take our physical network information and then integrate that as part of our Enterprise Resource Planning [ERP] and management, so that you can do things such as asset tracking, procurement, and financial management of all the assets that sit in the network, which is important because you’ve got CFOs that now need to certify processes in place. The integration of our inventory with our ERP is really unique because we’re the only ones in the marketplace that can really tie in the entire network lifecycle from planning through to procurement, through asset management and the use of equipment and facilities in the network, and then even the financial depreciation of them.”

“To summarize,” says Sharpley, “our new Unified Inventory Management system for service providers is open and standards-based. It looks at the different layers in terms of a functional view of inventory. And clearly the key differentiator for us at Oracle is the ability to integrate both our CRM and ERP capabilities to provide operators with a complete lifecycle.”

Rajeev Tankha, Director of Product and Solutions Marketing at Oracle, says, “The reason we call it ‘unified’ is that first, most inventory, as you probably are aware, has been focused purely on the physical aspects of the network. Take the stories of how inventories have been traditionally approached by network operators. They consolidate all the data, and every time there’s a new technology coming in, everyone says, ‘This is the last system we need to install.’ But it isn’t, and in any case you end up with various silos. If you look at large operators such as BT, AT&T, Telstra or China Telecom, they have literally hundreds of silos and databases of different inventories. The separate parts of the business have their own inventories, and they’re not going to change that, whether you like it or not. As vendors of software, most say, ‘let’s consolidate,’ so our competitors, such as the big names in the inventory business, have all taken the approach of wanting to consolidate everything into a single mega-inventory. It never actually works, of course, even though millions of dollars have been spent on the idea. That’s one of the key challenges folks are facing, as the requirements have shifted away from a connection-based world to a connectionless world, such as managing an IP infrastructure.”

“When a product is architected and designed to manage a physical network,” says Tankha, “it becomes extremely difficult for it to manage an IP infrastructure, where the rules of information are different. That’s what David was talking about, a common information model. We’ve approached these operators and we’ve said, ‘Look, we’re talking about unification at three levels.’ First we unify services across the network and give providers a customer-centric view of services. They want to map customers to particular services and deliver those services in the future. When carriers are creating commercial services in the CRM field, they talk about providing 360-degree views of the customer, but most traditional inventory systems actually give you views of the network and not the service. How is that service associated with the customer? What are the resources allocated to that service? That’s one way of looking at it, and that’s where most people are struggling today. Operators are launching IPTV services, and more and more consumer-based items. So that’s one key problem that most carriers face.”

“The second part is this concept of consolidation and integration,” says Tankha. “What we’re saying should be taken as a guide. We understand that there are systems which may be old, but they do continue to work. Trying to change over a parts inventory system is not going to work. So, our objective is to create a common information model; to be able to pull in data from any source that may exist and to make it available to any user. That user could be in network engineering, or the user could be somebody in the marketing group who needs to have a customer-centric view. Or it could be a CFO managing and tracking the supply chain and financials, and so forth. These people need to have the same information anywhere, whenever they need it. Our objective is to bring in the data from wherever it is, have a trusted source of information, and then make that available to whoever wants it, wherever they are. That’s the key thing.”

“The third part is, in adding on all of this stuff, we need to be able to bring to bear additional capabilities such as intelligence and analytics, so you can do forecasting and planning for the future,” says Tankha. “We are able to bring all of this together into one solution, and that’s where we are different from other solutions. Most carriers today are asking not for consolidation, but for integration. We have the answer for them. Carriers are saying ‘I want to have service views of the customer, rather than network views’. We are able to do that. And the third item is the integration of processes – and Inventory fits right bang in the center of these things. Those processes could be deploying a network, or doing full-asset tracking in management, in the full life cycle from deploying the network to upgrading it to maintaining it, to retiring a particular asset, and that information needs to be made available across the whole network. We do that. We can deal with service provisioning. And of course the final area is financial control and compliance with such things as the Sarbanes-Oxley Act or the European Union directives. That’s not a new concept – but the point is that nobody’s been able to do all of it. It’s a frontier between inventory and financial systems.”

“But by being able to define the integration point and the various processes for financial compliance,” says Tankha, “we’re able to deliver a single-source solution to customers that gives you a single view over everything. That’s what we mean by ‘unification.’”

So, it appears that Oracle’s next-gen Unified Inventory Management solution is indeed different from SAP or other competitors thanks to its comprehensive network lifecycle management and compliance capabilities.

Software Review: Miro 1.0

As a new and developing tech market, internet TV has a varied and fairly short history. First, there was streaming video, and the quality of what you could watch depended on your connection speed. As people moved into the world of broadband internet, the quality and ability to access video online increased as well.

Then came YouTube, which popularized the idea of social interaction with video and finally showed that streaming video was here to stay. But many of us got tired of watching video in small pop-up windows, and the idea of high-definition video that utilized fast broadband speeds to stream content over the internet came about. Now, products like AppleTV and Joost bring the best elements of on-demand television and streaming video together, creating a whole new field of contenders in the internet TV market.

While Joost remains one of the biggest players in the internet TV market, it's certainly not the only one. Miro, an open source program devoted to high quality internet TV, is one software platform that hopes to take a large piece of the internet TV pie, and Miro is doing so through a completely open and free platform that is not controlled by content providers or advertising revenue. Earlier this month, Miro finally hit the big time by releasing Miro 1.0, adding many improvements to the interface and the program itself.

What this means for consumers is that, with Miro, you get a platform that is open to anyone who packages their original videos into a standard RSS feed, much in the same way you would access blogs or podcasts on a Web site. Additionally, Miro's open source credentials mean that anyone technically minded can add new plug-ins and services to the program, much like Mozilla's Firefox browser. It's like YouTube for the technically minded, offering much more flexibility than the competition.
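Since a Miro "channel" is just an ordinary RSS feed whose items carry video enclosures, publishing to it requires nothing more than standard feed markup. A minimal sketch of how a client might pull video URLs out of such a feed, using only the Python standard library (the feed content and URLs here are made-up examples, not a real Miro channel):

```python
import xml.etree.ElementTree as ET

# A made-up example of a channel feed: each item carries an
# <enclosure> tag pointing at the actual video file.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Video Channel</title>
    <item>
      <title>Episode 1</title>
      <enclosure url="http://example.com/ep1.mp4"
                 type="video/mp4" length="104857600"/>
    </item>
    <item>
      <title>Episode 2</title>
      <enclosure url="http://example.com/ep2.mp4"
                 type="video/mp4" length="157286400"/>
    </item>
  </channel>
</rss>"""

def video_enclosures(feed_xml):
    """Return (title, url) pairs for every item with a video enclosure."""
    root = ET.fromstring(feed_xml)
    results = []
    for item in root.iter("item"):
        enc = item.find("enclosure")
        if enc is not None and enc.get("type", "").startswith("video/"):
            results.append((item.findtext("title"), enc.get("url")))
    return results

for title, url in video_enclosures(FEED):
    print(title, url)
```

Because the format is the same one podcast clients already use, any blogging or podcasting tool that can emit enclosures can, in principle, feed a Miro channel.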


The Miro 1.0 welcome screen.

The first noticeable difference between Miro and Joost is that Miro has an interface more in line with iTunes than a media center product, giving you an easier way to navigate programs before you watch. Miro also claims to have over 2,000 channels, while Joost has around 300. But let's be clear about something: the quality of Miro's "channels" varies from major media broadcasters (NBC, Adult Swim, etc.) to some guy with a web cam, so it's often more difficult to find good entertainment. However, Miro has a larger collection of "news" channels, while Joost only has a few of the major ones like CNN and CBS.

While Miro's content quality varies, it does have nearly the same stuff that Joost offers, except commercial free. And one advantage of Miro 1.0 (or disadvantage, if you have a fast connection) is that programs are downloaded automatically to your computer instead of streaming over the internet, giving more access to those with slower connections.



Downloading a program on Miro 1.0.
Miro 1.0 also takes up fewer system resources than either Veoh or Joost. For Windows, Miro requires 128 MB of RAM and DirectX 3.0 or higher running on Vista, XP or 2000 (it also has unofficial support for Windows 95 and 98). For Macintosh, Miro requires at least OS X 10.3 and QuickTime 7 to run. And of course, there is a Linux version with official packages for the Fedora, Ubuntu, Gentoo, and Debian distributions.
I tested Miro running Windows Vista and had very few problems getting started with Miro 1.0; in fact, it ran much more smoothly than similar tests using Joost and Veoh. The graphical interface is easy to use and had a familiarity about it while still using cutting edge technology. Within minutes, I had added all of my favorite channels, and the latest episodes started downloading to my computer immediately. To save hard drive space, Miro only keeps a copy of each show you download for five days, with the option to delete it or keep it permanently. If you choose to keep the show, the file is saved in a folder, and you can upload it to any device, since Miro does not use DRM to lock shows to the software platform.

Miro 1.0's provided channels are easy to use
Even though Miro 1.0 provides an easy to use software platform, it's not perfect. Some of the shows, even ones promised as High Definition quality, were choppy and had digital noise problems in full screen mode. I also found that the search bar, a feature that connects to the major video providers like YouTube and Revver, did not find many common videos on these sites. And some of the videos that it found were not full videos; many were just clips of the original content, forcing users to go to the content provider's Web site to view the rest of the video.
Overall, Miro 1.0 is certainly a step forward for open source video technology, and is a contender in the lucrative internet TV market. I found Miro's easy to use interface and content quality to be better than Veoh, but not quite as good as Joost. Even though Miro promises more content and open standards, Joost still has better content with a guarantee of quality. But Miro is in a great position to surpass Joost, and with newer versions, Miro may just take the number one spot as the place to be for internet TV. To download Miro 1.0 for free, go to getmiro.com.

Microsoft working to close 8-year-old Web proxy vulnerability

Microsoft is working on a fix for an eight-year-old flaw in Windows that lets hackers exploit a Web proxy autoconfiguration protocol and take over groups of machines via a single attack. Microsoft has yet to release the update it has been working on since last week that addresses the vulnerability in the Web Proxy Autodiscovery Protocol (WPAD) in Windows.

The flaw was first discovered in 1999, and some experts say it has never been adequately patched.

The flaw affects all versions of Windows including Vista, but does not affect computers in the United States. Microsoft reportedly patched the flaw eight years ago to protect computers that use the “.com” domain as part of their corporate identity. The fix, however, does not work for computers that use domain country codes, such as .nz (New Zealand) or .uk (United Kingdom).
WPAD is a method used by Web browsers to locate a proxy configuration file called wpad.dat that is used to configure a Web browser’s proxy settings. Part of the flaw lets the search for the configuration file leave the safety of the corporate network, thus opening an avenue for a hacker to hijack the request and deliver a configuration file to the browser that could then be exploited to intercept and modify the user’s Web traffic.

The Windows WPAD feature was designed so administrators would not have to configure browser proxy settings on each desktop manually. All the automated WPAD configuration work takes place out of view of the user. Last week, Beau Butler, who also goes by the name Oddy and the title “ethical hacker,” presented his rediscovery of the WPAD flaw at the annual Kiwicon security conference at Victoria University of Wellington in New Zealand. Butler told conference attendees and Australia’s The Age Web site that he found 160,000 computers in New Zealand using the .nz domain that were vulnerable to the WPAD flaw. The Age said Microsoft asked it not to publish the details over fears they could be used by cybercriminals to seize control of workstations. Microsoft confirmed it was a serious issue, The Age said.

Some details about the vulnerability, however, are available by doing simple queries on Microsoft’s own Live Search Web site. Additionally, Microsoft publishes Knowledge Base articles on its Web site describing how WPAD works.

The Kiwicon conference’s session abstracts said Oddy (Butler) would be “explaining all the ways in which networks can be configured in order to make WPAD leakage a non-problem.” According to the Microsoft Web site “WPAD lets services locate an available proxy server by querying a DHCP [dynamic host configuration protocol] option or by locating a particular DNS [domain name system] record.”

Web caching expert Duane Wessels, who helped develop Squid, a high-performance proxy-caching server, has a Web site that explains the flaw to users. “Basically, it works like this: When the browser starts, it issues a DNS address lookup for ‘wpad.foo.com’ where ‘foo.com’ is the domain name of the computer. Due to some Microsoft bug, some browsers look up ‘wpad.com,’ which is my host,” he wrote on his Web site. Wessels made the post after getting angry letters from network administrators who saw traffic funneling to his Web site.

In actuality, the DNS lookup only happens if DHCP does not turn up the wpad.dat file. DNS is the next option, and the lookup for “wpad.com” happens as WPAD crawls down the network DNS hierarchy searching for the address of the wpad.dat proxy configuration file. WPAD typically can guess rather accurately that it is searching on a company’s own internal network, but the country code domains derail the process and let the search go off the corporate network.
Regardless of where the search progresses, once the location of the wpad.dat file succeeds, the browser makes a connection and gets the file and configures the browser. If a hacker succeeds in getting his own wpad.dat file to configure the browser, the attacker can point the browser to his proxies and intercept and modify all the browser’s HTTP traffic. Although the available data on WPAD’s inner workings points to how it can be exploited, Butler has not discussed his specific exploit except during his Kiwicon presentation last week.
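The devolution behavior described above can be sketched concretely. The hypothetical Python fragment below (an illustration of the lookup pattern, not Microsoft's actual implementation) naively walks down a host's DNS hierarchy, trying "wpad." at each level and stopping before the bare top-level domain. For a .com host the walk stays inside the organization, but for a two-part country-code domain such as .co.nz the final candidate falls outside it:

```python
def wpad_candidates(fqdn):
    """Naively devolve a host's domain name, producing the wpad.<suffix>
    hostnames that would be queried at each level. The walk stops before
    the bare top-level domain (the post-1999 ".com" fix), but it does not
    know that "co.nz" is itself a public registry suffix."""
    labels = fqdn.lower().split(".")
    candidates = []
    # Drop the host label, then devolve: a.b.c -> wpad.b.c -> wpad.c
    for i in range(1, len(labels) - 1):
        candidates.append("wpad." + ".".join(labels[i:]))
    return candidates

# A .com host: the search stops safely inside the organization.
print(wpad_candidates("pc1.corp.example.com"))
# A .co.nz host: the same walk ends at wpad.co.nz, a public domain
# that anyone can register -- the leak Butler demonstrated.
print(wpad_candidates("pc1.corp.example.co.nz"))
```

Whoever registers a hostname like wpad.co.nz can answer those stray lookups and serve a malicious wpad.dat to every vulnerable browser in that country code, which is why fixing the problem requires the devolution logic to understand multi-label registry suffixes.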

Microsoft was contacted for input on the issue, but had not responded by the time this article was posted.

Saturday, November 24, 2007

Sasser is fastest written Windows worm

The "Sasser" computer worm now plaguing computers around the world was based on a critical software flaw revealed by Microsoft just 17 days before the worm's release.
Microsoft revealed a total of 20 software bugs in a bulletin issued on 13 April and the first version of Sasser appeared on 30 April. Over the next few days this and three variants - tweaked to improve the speed of infection - succeeded in infecting many hundreds of thousands of computers worldwide.
Previously, the Blaster worm held the record for the fastest written Windows worm. It was unleashed on 11 August 2003, using a vulnerability revealed 25 days before it started to spread itself.
Yet, despite the shrinking gap between the disclosure of a bug and the appearance of a worm or virus, experts say trying to keep flaws secret would be more dangerous. A worm could cause far more damage if it were based on a vulnerability that was not widely known about, they say, as very few people would have a patch in place.
"There's a false notion that secrecy equals security," says computer security expert Bruce Schneier. "What you end up with is very fragile security - as soon as you lose your secrecy you're insecure."
Made public
Many computer worms, viruses and hacking tools exploit bugs that are openly disclosed by software companies.
Stuart Okin, chief security advisor for Microsoft UK, says flaws are often discovered by researchers outside of Microsoft. "We always work on the assumption that, if it is externally found, it will become public," he told New Scientist.
Microsoft says customers should apply software patches quickly and use firewall and anti-virus software to keep their systems secure.
But Schneier believes this may disguise the main issue. "I believe the real problem is that software quality sucks," he told New Scientist. Schneier suggests that software companies would improve the quality of their code if they were held legally liable for any damage resulting from bugs.
Okin points out that Microsoft is working to improve the security of its code through a programme that began three years ago.
Forced restart
The main impact of Sasser and its variants is to cause infected machines to restart when a user attempts to access the internet. The worms infect computers across a network by exploiting a bug in a component of Microsoft's Windows XP and Windows 2000 operating systems.
An infected computer scans local network connections and randomly generated IP (internet protocol) addresses to find fresh systems to infect. Once a vulnerable computer is discovered, the worm breaks in and then installs an FTP (file transfer protocol) server. This allows it to transport a copy of itself to the new machine.
As Microsoft's products dominate the global market for computer operating systems, worms targeting Windows spread further and cause more damage.

The Value of the PSP

By J.M. Smith on November 22nd, 2007

Early in its life cycle, the price of a PlayStation Portable was often a point of ridicule for those who supported Nintendo’s DS. The latter system’s $130 price tag has given it a nearly untouchable status in the handheld market. Certainly, this lower expense was the main reason that I myself initially shied away from purchasing Sony’s entry into the handheld market. Once I finally got my hands on a PSP, albeit as a birthday gift, I began to realize just how much functionality comes out of such a diminutive device.

The most important function of a PSP, obviously, is its capability as a gaming system. During the initial release window, Sony and other companies saw fit to simply translate boatloads of PlayStation 2 games into portable packages. Some saw this as a problem: why pay $40 for a game you’ve already played many iterations of in the comfort of your home? Others embraced the idea wholeheartedly, content to revisit old favorites while developers got their feet wet. The ability to take a console-level production on the road was more than enough for these sorts of people. In comparison with the DS, this massive computational power is the PSP’s big draw. The system also has the advantage of being able to accurately mimic analog control with its tiny black nub. This may not seem like much, but it offers much more precise movement in a 3D environment than does the use of a stylus or a directional pad. The familiarity of the button placement and control schemes has likely drawn in many a PS1 and PS2 veteran as well.

Until they had begun to fade from store shelves, and despite low sales, UMD movies made great use of the PSP’s clear, bright screen and clean audio sampling. The format was initially popular with movie studios as a quick way to cash in on both recent and previous releases. Most system owners, on the other hand, were not so ready to jump on the bandwagon, as they were justifiably repulsed by the often outrageous prices of the miniature DVDs. Often, a film on UMD lacking many extras would cost as much or more than its two-disc special edition DVD counterpart. The failure of this convenient format is a prime example of a solid idea brought down by an innate corporate desire to make as much money as possible. Even if a high cost of disc production is assumed, Sony and others probably could have found a solution or come to middle ground on the pricing issue.

Because UMDs were generally expensive, resourceful PSP owners utilized the Memory Stick slot and Sony’s own Media Manager or a freeware third-party tool called PSP Video to convert and watch movies on the go. While the company’s proprietary removable media format was usually priced higher than the SD or CompactFlash competition, inserting one into a PlayStation Portable turned the handheld into a multimedia machine comparable to an iPod, though with much less storage space and shorter battery life than Apple’s current models. The system’s external speakers, while nothing to write home about, are a sorely missing component on most MP3 players. The PSP can play movies, music, and photo slideshows just like the competition, of course in addition to its application as a gaming device.

The system’s Wi-Fi capabilities bolster the experience by providing users with a way to play many games online. While the DS has this feature, it took nearly a year for any games to be released with support for it; the PSP had a few online titles at launch. The ability to download system updates is an added bonus, although any game that requires a system update will include it on the disc. The web browser, while not perfect by any stretch of the imagination, is a perk for those who want to hop on the internet at a hotspot to check e-mail or read RSS feeds. One of the most recently added features is the ability to network with a PlayStation 3 from any Wi-Fi access point, enabling the user to view pictures and listen to music stored on the system’s hard drive.

The core system is now $30 cheaper than it was originally and now includes the capability of outputting video to a TV. For the same price as the first model, which included nothing but the PSP itself, you can purchase this lighter, slimmer model with a game, a 1GB Memory Stick, and a UMD included in the package. Nowadays, the cost of a Memory Stick is perhaps a little bit more than most flash memory, but prices are well within reason. Additionally, cards with storage up to 8GB are available. The internet browser is much better at formatting web pages than it was in the past. Developers are beginning to discover the full potential of the machine, pushing graphics comparable to some of the better-looking PS2 titles. The system’s library is constantly growing with new games in every genre.

All of the features mentioned above, as well as a few more, can be found in the PlayStation Portable. Yet it remains a point of contention with potential buyers that the system is more expensive than Nintendo’s handheld offering. The DS certainly has its strong points, including a massive catalog of well-designed and unique games, but it fails as a convergent multimedia device unless outfitted with an expensive third-party storage peripheral, one designed for those familiar with computer terminology and structure. This is not ideal for those who want a plug-and-play experience. The PSP is of great value to a gamer who yearns for more in a portable device than the ability to run video games. Perhaps the price isn’t so bad after all.

Wednesday, November 21, 2007

GSM-CDMA slugfest on despite daylong meet

Nivedita Mookerji
Thursday, November 22, 2007 05:06 IST

http://www.dnaindia.com/report.asp?newsid=1134692

NEW DELHI: The tussle between the two mobile groups - GSM and CDMA - continues despite daylong deliberations between the government and the industry on Wednesday.

After meetings spread out over four hours with top officials of the department of telecommunications (DoT) on issues related to spectrum allocation and dual technology, mobile operators said they were “in dialogue” with the government, indicating no consensus had emerged. Company sources said only preliminary discussions were held on some of the contentious issues.

While the DoT secretary DS Mathur and other senior officials met the top representatives of all private telecom companies in the morning, there were some one-on-ones between the government and the industry later in the day. Media was not allowed inside Sanchar Bhavan, which houses DoT, all through Wednesday.

Reliance Communications’s Anil Ambani, Bharti’s Sunil Mittal, Idea Cellular’s Sanjeev Aga and Vodafone Essar’s Asim Ghosh were among those who attended the DoT meetings on Wednesday. None of them was ready to make any comment on the day-long deliberations.

Mittal said, “We have been asked not to speak to the media.”

The objective behind the DoT-industry meet was to resolve some of the differences between two mobile phone groups. GSM members had recently moved the telecom dispute tribunal (TDSAT) against the new government policy allowing use of dual technology and the revised spectrum allocation criteria.

MS Word in Internet browser environment!

Mumbai: ‘Hotmail’ founder Sabeer Bhatia-promoted firm, Install Collaboration Software Technologies Pvt Ltd (InstaColl), has launched a software that allows office suite applications such as Microsoft Word to function in an Internet browser environment.

Christened ‘Live Documents’, the product could ‘pose a significant challenge to a significant portion of Microsoft’s revenues from the office suite’ because it encompasses all the features of Microsoft Office 2007, Sabeer Bhatia, Chairman, InstaColl, said. “The software allows changes made on the document in the browser environment to be synchronised with the document on the desktop in real-time, ensuring that the desktop and web versions are always in sync,” he told presspersons here on Wednesday.

This also means that any change made by one user of a multi-user document will be instantly updated for all users and can be accessed from anywhere in the world.

For individual users the product is distributed free of cost and can be downloaded from the company web site. “For corporate clients, we may look at different pricing models. We may charge something like $50 a year, or $10 for a month,” added Bhatia.

Sunday, November 18, 2007

The Value Imperative — Part 3: Charting and Technical Analysis


As a child did you lie on your back in the grass looking up at the clouds? Did you see lions, houses, faces and other shapes? If so, perhaps you would be good at charting.

In Part 3 of our series of articles on finding value in the stock market we consider charting and technical analysis. Charting consists of the careful study of past price movements, and perhaps past trading volumes, in order to make forecasts about future prices. I had a friend who would buy large rolls of charting paper and scroll them across his work bench. He would meticulously mark price movements and then overlay them with triangles of various shapes and sizes. His aim was to detect hidden triangular patterns because he believed that they would be repeated, allowing him to forecast future movements of the price.

I doubt many people do it these days in such a laborious way. Eyeballing yards and yards of chart paper has been replaced by sophisticated computer packages sniffing out past patterns. This is called technical analysis.

In the first part of the article I want to sketch the ideas of three of the most influential chartists, Charles Dow, Ralph Elliott and William Gann.

Dow Theory

When charting, perhaps people don't search for lions and elephants, but they do look for other interesting shapes with suggestive names such as double bottoms and head and shoulders. The original school of charting was based on 255 Wall Street Journal articles written by Charles H. Dow (1851-1902), the founder and first editor of the Wall Street Journal and co-founder of Dow Jones and Company with Edward Jones and Charles Bergstresser.

After his death William P. Hamilton, Dow's understudy and the fourth editor of the Wall Street Journal, organized the material and presented it in a more coherent form. According to John Murphy in his book Technical Analysis of the Financial Markets, the basic tenets of Dow Theory are:

1. Averages discount everything.
2. The market has 3 trends.
3. Major trends have 3 phases.
4. The averages must confirm each other.
5. Volumes must confirm the trend.
6. A trend is considered to be in effect until there are definite signals showing otherwise.

As an example of a trend, Dow defined an uptrend as a time when successive rallies in a security price close at levels higher than those achieved in previous rallies and when lows occur at levels higher than previous lows.
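Dow's uptrend definition lends itself to a direct check. Below is a minimal Python sketch; the function name and inputs are illustrative, and in practice extracting rally closes and pullback lows from raw prices would itself require swing-point detection:

```python
def is_dow_uptrend(rally_highs, pullback_lows):
    """Dow-style uptrend: each successive rally closes above the previous
    rally's close, and each pullback low sits above the previous low."""
    higher_highs = all(h2 > h1 for h1, h2 in zip(rally_highs, rally_highs[1:]))
    higher_lows = all(l2 > l1 for l1, l2 in zip(pullback_lows, pullback_lows[1:]))
    return higher_highs and higher_lows

# Hypothetical rally closes and pullback lows:
print(is_dow_uptrend([100, 105, 112], [95, 98, 104]))  # True: higher highs and higher lows
print(is_dow_uptrend([100, 105, 103], [95, 98, 104]))  # False: the last rally failed to exceed the prior one
```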

The three phases of a trend are accumulation, public participation, and distribution.

The idea that trends must be confirmed by volume is that price movements accompanied by higher volume represented, in Dow's view, the "true" market view. If many participants are active in a particular security, and the price moves significantly in one direction, Dow maintained that this was the direction in which the market anticipated continued movement. To him, it was a signal that a trend was developing.

Elliott Waves

Another major school of charting is based on the ideas of Ralph Nelson Elliott (1871-1948), a professional accountant. His idea was that market prices unfold in specific patterns which practitioners today call Elliott waves. Elliott published his views of market behavior in a number of books culminating in Nature's Laws - The Secret of the Universe published in 1946.

He was not short on confidence writing that "because man is subject to rhythmical procedure, calculations having to do with his activities can be projected far into the future with a justification and certainty heretofore unattainable."

A major proponent of Elliott wave theory is Robert Prechter who has written many books on Elliott waves and their applicability to forecasting markets. One of his forecasts has been consistent: "One thing I've repeated consistently," he said, "is that the great bear market will take the DJIA at least below 1,000 and likely to below 400." Phew!

A few years ago I shared the platform with Prechter at a conference on finance and investing. After his talk, in my innocence I asked if the earnings of companies played a role in the stock price. He looked at me as if I was from another planet. In the official proceedings of the conference he stated that exogenous forces such as "economic reports, wars and peace treaties, terrorism, elections, corporate earnings, scandals, Fed actions and the movements of other markets" are irrelevant. "None of these classes of events," he continued, "has a leading or coincident relationship to stock price movement."

Another aficionado of Elliott is Richard Swannell, who wrote software called Elliottician. I met him back in 2001 and asked if he had made any money from his system. He said that at that point he had not used it with his own money. Perhaps things have changed since then, as I notice on his site that he states, "I will personally give US$100,000 to anyone who can prove they have a more accurate market forecasting tool than my Refined Elliott Trader (RET) software. This can include any software program or any technical indicator, whatsoever..."

Starting with Elliott himself and the quote by him above, then moving on to Prechter, and finally to Swannell, it seems that self-assurance is at stellar levels in the world of Elliott waves.

Gann Charts

The next main character in the pantheon of the most influential charting proponents is W. D. Gann (1878-1955). His idea is that the market will make highs at integer multiples of the all-time low and that the timing will be quantifiable from the value of this low. It also applies, to a lesser extent, to major lows and highs.

Gann believed that the ideal balance between time and price exists when prices rise or fall at a 45 degree angle relative to the time axis. This is also called a 1 x 1 angle since prices rise one price unit for each time unit.

Gann Angles are drawn between a significant bottom and top (or vice versa) at various angles. The 1 x 1 trendline was considered the most important by Gann and signified a bull market if prices are above the trendline or a bear market if below. Gann felt that a 1 x 1 trendline provides major support during an up-trend and that when the trendline is broken, it signifies a major reversal in the trend. Gann identified nine significant angles, with the 1 x 1 being the most important. They are:

1 x 8 - 82.5 degrees
1 x 4 - 75 degrees
1 x 3 - 71.25 degrees
1 x 2 - 63.75 degrees
1 x 1 - 45 degrees
2 x 1 - 26.25 degrees
3 x 1 - 18.75 degrees
4 x 1 - 15 degrees
8 x 1 - 7.5 degrees

Gann observed that each of the angles can provide support and resistance depending on the trend. For example, during an up-trend the 1 x 1 angle tends to provide major support. A major reversal is signalled when prices fall below the 1 x 1 angled trendline. According to Gann, prices should then be expected to fall to the next trendline (i.e., the 2 x 1 angle). In other words, as one angle is penetrated, expect prices to move and consolidate at the next angle.
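The fan of angles above can be sketched numerically. The following is a hypothetical illustration using the time-by-price ratios from the table; the `unit` parameter stands in for the price-per-time scale that, as discussed later, Gann never pinned down:

```python
# Traditional Gann ratios as price units per time unit, matching the
# table above: "1 x 2" means 1 time unit by 2 price units (63.75 degrees).
GANN_ANGLES = {
    "1x8": 8.0, "1x4": 4.0, "1x3": 3.0, "1x2": 2.0,
    "1x1": 1.0, "2x1": 0.5, "3x1": 1 / 3, "4x1": 0.25, "8x1": 0.125,
}

def gann_fan(pivot_price, bars_elapsed, unit=1.0, rising=True):
    """Price level of each Gann fan line `bars_elapsed` bars after a pivot.
    `unit` is the price move assigned to one time unit; `rising` fans
    upward from a low, otherwise downward from a high."""
    sign = 1 if rising else -1
    return {name: pivot_price + sign * slope * unit * bars_elapsed
            for name, slope in GANN_ANGLES.items()}

# Fan lines 10 bars after a low at 1,000, at 1 point of price per bar:
levels = gann_fan(1000, 10)
print(levels["1x1"])  # 1010.0 -- the 45-degree line
print(levels["1x2"])  # 1020.0 -- two price units per time unit
```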

The following is a typical Gann chart or Gann fan. It would be interpreted as the index "bouncing" off the 2 x 1 and 1 x 2 lines.


One major stumbling block with this method is that it depends on the units that you use, since it is not always practical to give the 1 x 1 line a value of 1 point of price for each day. For example, if the S&P 500 is trading around 1,500 it is not sensible to consider time units of 1,500 days. Gann said that another scale should be used, but did not give any rules as to how to choose it.

Although you can buy computer packages to implement the above methods, as they were described by their developers the forecasts could be produced with pencil and paper and a bit of care. The next step in the story is to use computer tools to search for patterns and features as part of making forecasts. This is called technical analysis, and the following are brief descriptions of some of the main technical analysis tools.

Moving Average Operators

This is an average, possibly weighted, of prices over some time period. Because it is an average, it filters out high-frequency data, which can be interpreted as noise, and so is taken to represent more accurately the "true" trend of the price. Moving averages are therefore referred to as low-pass filters.

Long-term moving averages are slower to respond than short-term moving averages. Hence a typical trading rule would be to buy when:

1. the 50-day moving average is slowly rising, and
2. the 15-day moving average crosses from below the 50-day moving average.

Another way of describing these two requirements is that there was an overall slight upward trend which has recently accelerated.
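The two-part rule above can be sketched directly. A minimal Python illustration (the function names are mine, and real implementations would operate on dated price series):

```python
def sma(prices, window):
    """Simple moving average; None until enough data points exist."""
    return [None if i + 1 < window else sum(prices[i + 1 - window:i + 1]) / window
            for i in range(len(prices))]

def crossover_buy_signal(prices, fast=15, slow=50):
    """Buy when the fast average crosses above a rising slow average,
    mirroring the 15-day / 50-day rule described above."""
    f, s = sma(prices, fast), sma(prices, slow)
    signals = []
    for i in range(1, len(prices)):
        if None in (f[i - 1], s[i - 1]):
            signals.append(False)       # not enough history yet
            continue
        slow_rising = s[i] > s[i - 1]
        crossed_up = f[i - 1] <= s[i - 1] and f[i] > s[i]
        signals.append(slow_rising and crossed_up)
    return signals

# Tiny illustration with short windows (a real rule would use 15/50 days):
print(crossover_buy_signal([3, 2, 1, 2, 3, 4], fast=2, slow=3))
# -> [False, False, False, True, False]: the signal fires when the 2-bar
#    average crosses above the now-rising 3-bar average
```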

Channel Breakout Operators

A channel is the price range between the lowest price and the highest price over a period of time. If the price moves above the high end of the channel, it is a signal to buy. Conversely, if it breaks out in the lower direction, it is a signal to short the stock.
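A minimal sketch of this rule, assuming a Donchian-style lookback window (the window length and function name are illustrative):

```python
def channel_signal(prices, lookback=20):
    """'buy' if today's price breaks above the prior lookback-period high,
    'sell' if it breaks below the prior low, else 'hold'."""
    if len(prices) <= lookback:
        return "hold"                    # not enough history to form a channel
    window = prices[-lookback - 1:-1]    # the channel excludes today's price
    if prices[-1] > max(window):
        return "buy"
    if prices[-1] < min(window):
        return "sell"
    return "hold"

print(channel_signal([10, 12, 11, 13, 15], lookback=4))  # "buy": 15 breaks the 10-13 channel
print(channel_signal([10, 12, 11, 13, 9], lookback=4))   # "sell": 9 falls below the channel low of 10
```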

Stochastic Operators

These operators measure the relative position of the price in a channel over a specified number of days. Because the measure is a relative position, it removes the trend in the price within a moving channel. It is the opposite of moving average operators, since it acts as a high-pass filter.
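A close-only sketch of the idea follows; note that the classic stochastic %K uses separate high and low series for the channel rather than closing prices alone:

```python
def stochastic_k(prices, period=14):
    """%K: today's price as a percentage position within the high-low
    channel of the last `period` prices (0 = at the low, 100 = at the high)."""
    window = prices[-period:]
    lo, hi = min(window), max(window)
    if hi == lo:
        return 50.0                      # degenerate flat channel
    return 100.0 * (prices[-1] - lo) / (hi - lo)

# A close of 15 inside a recent 10-20 channel sits halfway up:
print(stochastic_k([10, 14, 20, 12, 15], period=5))  # 50.0
```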

Bollinger Bands

This approach was developed by John Bollinger in the early 1980s: a band whose position and width vary according to price movements. The band consists of three curves. The middle curve is a measure of the intermediate-term trend, usually a simple moving average, and serves as the base for the upper and lower edges. The distance between each edge and the middle curve is determined by volatility, typically the standard deviation of the same data used for the average.

A typical example of a Bollinger band is:

Middle Curve: 20-day simple moving average
Upper Edge: Middle Curve + 2 * 20-day standard deviation
Lower Edge: Middle Curve - 2 * 20-day standard deviation

It is an adaptive extension of the stochastic operators mentioned above.
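The three curves can be computed in a few lines. A sketch following the 20-day example above (population standard deviation is used here; some implementations prefer the sample version):

```python
import statistics

def bollinger(prices, window=20, width=2.0):
    """Middle = simple moving average of the last `window` prices;
    edges = middle +/- width * standard deviation of the same window."""
    recent = prices[-window:]
    middle = sum(recent) / len(recent)
    sd = statistics.pstdev(recent)       # population standard deviation
    return middle - width * sd, middle, middle + width * sd

# Tiny illustration with a 5-day window (a real band would use 20 days):
lower, mid, upper = bollinger([22, 23, 21, 24, 25], window=5)
print(round(lower, 2), round(mid, 2), round(upper, 2))  # 20.17 23.0 25.83
```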

Some traders treat the band as a type of "break out operator" and buy when the price breaks above the upper edge and sell when it drops below the lower edge. Others take the opposite view and buy when prices touch the lower edge and exit when price touches the moving average in the center of the bands.

Geometrical Patterns

There is an unlimited range of classes of geometrical patterns. These include the famous head and shoulders pattern, with the suggestive rule that you should short the stock when the neckline support is broken.

These should be enough examples to show you that the range of charting and technical analysis systems and rules is enormous.

Two questions remain. Why should we expect any of these rules to generate excess profits, where by excess profits I mean trading profits above what could be expected from random guessing after transaction costs? And, even if we expect some of them to work, is there any evidence that they actually generate excess profits in practice?

The Elliott Wave group tend to claim that their approach is based on real physical laws related to the rhythms of nature. In this regard, Robert Prechter likes to emphasize the role of herding. He wrote, "Under conditions of uncertainty, people instinctively and impulsively herd together and make decisions in ways that have little to do with conscious, rational thought." It is this herding, according to Prechter, that gets expressed in predictable patterns of waves.

Others take a more insouciant view of technical analysis. For example, in their book Technical Analysis of Stock Trends, Robert D. Edwards and John Magee write, "We can never hope to know why the market behaves as it does... History obviously has repetitive tendencies and that's good enough."

The trouble with the view that "if it works, it's good enough for us" is that there is scant consistent evidence that it actually does work. For example, in a recent book, Evidence-Based Technical Analysis, David Aronson back-tested 6,402 technical analysis rules on the S&P 500 over the period from November 1, 1980 through July 1, 2005.

In the final chapter of the book, Aronson writes: "No rules with statistically significant returns were found." None, zero, nada.

Looking at the above rules and strategies from the perspective of a mathematician, I can't help seeing a type of "gee whiz" quality about them. The people who discovered them have applied mathematics to what seemed a chaotic situation and are excited about their achievements. This seems to be followed by an attachment to their ideas.

In each case the mathematics is really very elementary but nevertheless generates some attractive charts and images. The trap is that mathematics is incredibly seductive but, for many, also very intimidating. As soon as a few equations or geometric figures are attached to some price charts, many people think that the methods are legitimized and proven.

More Advanced Methods

Even though Aronson tested over 6,000 rules, he just scraped the surface of known technical analysis strategies. For example, there is a whole range of attempts to use neural networks and more sophisticated mathematics such as fractal geometry and chaos theory to understand and forecast markets.

As an example, a paper by Mark Leung, Hazem Daouk and An-Sing Chen in the International Journal of Forecasting, titled "Forecasting stock indices: a comparison of classification and level estimation models", looks at technical analysis forecasting using a range of mathematical techniques including probabilistic neural network methods. They show that from January 1991 through December 1995 their methods on average outperformed the S&P 500, FTSE 100 and Nikkei 225 indices by 3.59% per year.

Better than nothing, but not too remarkable given that no mention was made of transaction costs or of slippage, which is the inability to actually make the transaction at the quoted price. There was also no discussion of drawdown levels or of possible excess volatility of the trading portfolio.

Another approach is to use intermarket influences. For example, currently I am an examiner for a PhD thesis in which the candidate uses neural networks to see if information on overseas markets can be used to make successful daily movement forecasts of the All Ordinaries Index in Australia.

At least qualitatively, it is well recognized that in today's world of global markets, aided by high-speed international financial transactions, there is a lot of influence between markets. Here are a few recent headlines in Australian newspapers: 'Stocks in the red after Wall St slide', 'Stocks follow Wall St lower at open', 'Stocks open flat after Wall St fall' and 'Stocks open higher after Wall St rebound'.

In fact, it is a cause of some surprise when the Australian market does not follow the US, as indicated by the recent headline in Australia: 'Stocks surge despite US malaise'. The PhD thesis uses neural networks to show that it is possible to increase the accuracy of forecasts on the Australian market by using information from overseas markets.

As far as anyone can tell, the trading system with the most computational power has been developed by the Prediction Company in Santa Fe. It is a secretive company founded in March 1991 by a group of physicists with the aim of coupling the most advanced mathematics with a powerful array of computers to conquer the world's markets. On their website they claim:

Our technology allows us to build fully automated trading systems which can handle huge amounts of data, react and make decisions based on that data and execute transactions based on those decisions - all in real time. Our science allows us to build accurate and consistent predictive models of markets and the behavior of financial instruments traded in those markets... We have a substantial track record with excellent results.

We don't actually know if these claims are true. Reading The Predictors, Thomas Bass's book describing the years of the Prediction Company up until 1999, certainly gives rise to doubts. For example, describing the state of development in 1995, Bass writes, "Their ambitious project to model the most actively traded stocks on the New York Stock Exchange remains a gleam in their eye." He explains that at the time they were waiting for a new methodology to come online.

This is such a common story told by technical traders. Everything will be fine and the profits will start to roll in. All that is needed are a few adjustments to the parameters, or the purchase of a new software system from the suppliers which just happens to cost twice as much as the current version. The successful results always seem to be imminent, just around the corner.

Up until the writing of the book by Bass, eight or nine years after the formation of the company, this was the story with the Prediction Company which is now fully owned by UBS Warburg.

Even if you have a trading system that works in principle, there is the problem of human greed and arrogance when it comes to its implementation. Consider the case of the Nobel laureates and other founders of Long Term Capital Management who ended up causing the company to lose billions of dollars. Roger Lowenstein provides a vivid account in When Genius Failed of how some of the most brilliant minds in the financial world dazzled bankers around the world with their reputations and mathematical prowess before crashing to earth caught in their own hubris.

For a period in the mid 1990s I also became caught up in the goal of creating a super trading system. I wrote a large amount of code for trading markets using volatility. My observation was that after periods of high volatility, the volatility settles back to its long-term levels. Similarly, after periods of low volatility it will increase to its long term levels. The idea was to be able to measure accurately the volatility relative to its long term levels and to trade it accordingly using instruments such as option straddles.

The other tool that I was involved with at the time was looking for inconsistencies in foreign exchange option markets. At least this one had a concrete outcome; I wrote a book with Valery Kholodnyi on the topic called Foreign Exchange Option Symmetry.

The Randomness of Markets

Over the short term, it is very difficult to show that charts formed by equity prices and indices are anything but random. In fact, trillions of dollars of contracts and transactions are based on the assumption that they are random. As an example, the ubiquitous Black-Scholes formula for option pricing, which you see in the annual reports of most companies valuing the options awarded to management and staff, assumes that prices follow a random walk.

Specifically, the formula assumes that the returns on a day by day basis follow a normal or Gaussian probability distribution. Most agree these days that these distributions are not normal, that they should have fatter tails. On this topic I recommend Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets by Nassim Nicholas Taleb.

The point is that even if we don't agree on the type of randomness, price performance is to a very high degree extremely close to random, and it is only with the use of sophisticated mathematics and powerful computers that prices can be distinguished from random.

Methods for generating random graphs throw up all the features of charting and technical analysis described above. A number of studies have been carried out with chartists to see if they could distinguish between actual price charts and randomly generated charts. They could not.

The chart below gives an example of how a simple computer program can be used to build up a chart in a systematic, but random, way so that the final result looks very similar to a typical price chart.

The chart is taken from the introduction to Derivatives and Financial Mathematics, edited by John Price.

Such charts can show all the features of typical price charts, but because they are randomly generated, it is not possible to use past data to make any forecasts of future behavior.
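Such a generator takes only a few lines. The sketch below assumes a geometric random walk with normally distributed daily returns (the parameters and function name are illustrative); plotting its output produces charts with the same "trends", "channels" and "head and shoulders" the eye finds in real price series:

```python
import random

def random_price_chart(start=100.0, days=250, daily_vol=0.01, seed=42):
    """Build a geometric random walk: each day the price moves by a
    return drawn from a normal distribution with the given volatility."""
    random.seed(seed)                    # fixed seed for reproducibility
    prices = [start]
    for _ in range(days):
        prices.append(prices[-1] * (1 + random.gauss(0, daily_vol)))
    return prices

chart = random_price_chart()
print(len(chart))  # 251 points: the start plus 250 trading days
```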

Since this is part of a series of articles on finding value in the stock market, it is only fair that I state what I think about charting and technical analysis as a method for finding value. Despite all the marketing hyperbole, I think it is highly unlikely that anyone working on their computer at home with a retail trading package (or one they designed themselves) is going to make money in anything like a consistent manner. If you have 1,000 people, some will make money during any given year. But over time, most people will probably lose money.

Perhaps if you had access to the forecasts of the Prediction Company, and you had very deep pockets to weather the drawdowns, you might be successful.

My advice if you are tempted to purchase a trading package: ask to see the actual trading records on the tax returns of the person who wrote the code and of the senior personnel in the company.

In the fourth article in this series I am going to look at the valuation methods of Benjamin Graham, the man who is called the Dean of Wall Street. Apart from describing his methods, I will also examine their relevance in today's markets.

Friday, November 16, 2007

About the Telugu Language

On a recent visit to Hyderabad I overheard a conversation suggesting that Telugu and Sanskrit had a lot in common. I was perplexed, and asked a few Andhraites, but none could give me answers. This aroused my interest in reading and sharing about the development of the Telugu language and literature. This article is taken verbatim from The History and Culture of Indian People by the Bharatiya Vidya Bhavan. After that I compared notes with The Cultural Heritage of India by the Ramakrishna Mission and made additions.

Origin - Telugu (T) is spoken by people contained within a semi-circle drawn with the line joining Rajahmundry and Madras as its diameter. Adjoining areas like Hyderabad, Orissa and Mysore account for a large number of Telugu-speaking people, while there are scattered groups in Tamil Nadu too.

While the Telugu country is called Andhra Desa, much ink has been spilt over whether or not the Andhras and the Telugu-speaking people are the same. According to Dr C R Reddy, they are a South Indian tribe that assimilated Aryan (meaning Arya, or cultured) culture and some elements of Prakrit but retained, racially and linguistically, its essential Dravidian character.

T is construed in different ways. Tene in T means honey hence T is explained as the language that is as sweet as honey. T is also derived from Trilinga the country that contains three lingas at Kalesvara, Srisaila and Draksharama. The T script is phonetic, after the manner of Sanskrit and bears a close resemblance to the Kannada script. Another view is that perhaps T is connected with ‘Telinga’, the name of a tribe that must have originally lived in the region. This assumption is supported by existence of Telagas, a major agricultural community in Andhra, and Teleganyas, a sub-sect amongst Andhra Brahmins, and also by the name Telengana, denoting a part of the Andhra region. It is thus possible that T was originally an ethnic name.

The earliest reference to the Andhras as a name denoting a tribe of people who migrated to the south of the Vindhyas is found in the Aitareya Brahmana. They are also mentioned in the Ramayana and Mahabharata. It is possible that the Andhras were migrants from the North, and their political and cultural domination over the people in the T region may have given their name to the country, the people, and later on to the language.

The political and social history of the Andhras can be sketched from the times of Satavahana rule (263 BC to 163 AD). The geographical position of the Andhra region as a meeting place of the North and the South has been a dominant factor in the political, social, cultural and literary development of the Andhras.

We have inscriptional evidence for T from the 3rd century BC, with the commencement of Satavahana rule in the Deccan. A few T words, mostly names, were found in the Prakrit inscriptions of the Satavahana kings and their successors. They occur in greater number in the Sanskrit inscriptions from about the middle of the 4th century AD.

The language came under the dominating influence of Sanskrit and Prakrit, which were the upper languages during the first five centuries after Christ. T inscriptions are available from the 6th century AD, and until the beginning of the 11th century we have about 100 such inscriptions written in poetry or prose.

T is a borrowing language, and it seems to have started borrowing from Sanskrit since its infancy. Sanskrit always held a superior position in Andhra, and it was the language of education and scholarship among the Telugu people till the advent of British rule. Proficiency in Sanskrit was considered indispensable to a T poet or scholar. The impact of Sanskrit on T was so great that until recently T poets and grammarians eulogized Sanskrit as the mother of T.

The period covered is 1000 to 1947.

1000 to 1300 AD

Desi and Margi - T literature had two streams, an earlier one called Desi (folk literature, an oral tradition) and a later one called Margi (Sanskrit). Desi literature was rural, popular and independent of Sanskrit. The Margi literature was a deliberate concoction at first: a rich, exotic and stimulating beverage meant for the sophisticated court and urban population. Nanniah's Mahabharata, which was composed in 1030 AD, stands at the head of Margi literature. The epic seized Nanniah's imagination and set it aflame. At that moment formal T literature was born.

Telugu Mahabharatam - Although Nanniah began the work of giving Vyasa's Mahabharata a Telugu habitation and a name, he was able to complete only the first two parvas or cantos and a part of the third, the Aranya Parva. After him it remained unfinished for two centuries, till in the latter half of the 13th century arose another poet, Tikkanna, and fifty years later Errapragada, who finished the T Mahabharatam.

Tikkanna (1220-1300) heralded a new era in T literature by making a fine synthesis of the marga of Nanniah and the desi of Palakuriki. A minister to a feudal king, he worked for the political unity of Andhra-desa with a view to averting a possible Muslim invasion. As an advaitin, he also worked for the religious unity of the Hindus by establishing the Hari-Hara cult. He brought together the Saiva and non-Saiva schools of T poets. His first work was the Ramayanamu, written in kavya style.

Nanniah, Tikkanna and Errapragada are the great trio, or Kavi Traya, of T literature. The first is remembered for rendering Vyasa's epic into mellifluous and transparent T. Although he uses more Sanskrit than T, his simplicity and ease are irresistible. Besides the Mahabharatam, Tikkanna wrote a poem covering Rama's life after his return from Lanka. His style is described as that of an intellectual. The T version is written in a mixture of prose and poetry. There is less philosophy, more luxurious description, less depth but more humanity. The last of the trio was the first poet to render the Harivamsa into T. His Nrsimha Puranamu is considered a landmark, as it initiated a literary type called Prabandha in T literature, referred to in the next period.

Apart from the great work of Nanniah, the 11th and 12th centuries saw little original literary activity. However, mention must be made of P Mallanna and E Peddanna, both of whom wrote mathematical treatises in T based on translations from Sanskrit originals. During the 13th century two versions of the Ramayana appeared. Of these, the version in couplets attributed to Ranganatha is the earlier. The other version, known as Bhaskara-Ramayanam, is in champu form. Although widely read, the Ramayanam does not reach the beauty of the T Mahabharatam.

The 12th and 13th centuries witnessed major political, social and religious changes in Andhra. The teachings of Basavesvara, prophet of Virasaivism, disturbed the social fabric of the people in both the Karnataka and Andhra regions. Virasaivism became partly a mass movement, and its gospel was preached through the literary works of Saiva poets. He advocated the desi against the margi tradition. Though the movement died with him, it did influence the writings of later poets.

1300 to 1526

The 14th century saw the downfall of the Kakatiya empire and the rise of a number of small kingdoms like the Reddis of Kondavidu. Errapragada, referred to above, was the court poet of Prolaya Vema, a Reddi king. His first work was a Ramayana in champu form, which is now lost. His junior contemporary and rival was Nachana Somanadha, a great scholar. In his translation of the Uttara-Harivamsa, through his artistic descriptions, lively conversations and beautiful imagery, he converted the Puranic story into a work of art, leading the way for the Prabandha of a later age.

The next 150 years, from 1350 to 1500, may be said to be the age of Srinatha, since he was the dominating literary personality of that time. He visited various courts and parts of the country, took part in literary and poetic contests, and had the honor of Kanakabhisheka at the hands of Devaraya II of Vijayanagara. His first work, and the first of its kind in Telugu, was the translation of the Salivahana Saptasati from Prakrit. He had great leanings towards Saivism and translated the Bhimesvarapuranam and Kasikhandam, which deal with holy places connected with Saiva worship. In Palnati-Viracharitra he entered the field of historical romance. Not only did Srinatha love his country with great devotion, but he also depicted the life and manners of various parts of the country with a humorous touch all his own in his Vidhinataka.

Potana, another outstanding poet, translated the Bhagavata into Telugu and chose to dedicate it to Lord Rama, the human incarnation of God, whom he worshipped with devotion. A number of works dealing with stories connected with the exploits or deeds of personages of old were produced during this period. Jakkana's Vikramarkacharita deals with stories connected with King Vikrama. The Navanadhacharitra of Gaurana describes the exploits of the Nine Nadhas, or Saiva saints. Suranna translated the Vishnu Purana. The Upanishadic story of Nachiketas was developed in Telugu by Daggupalli Duggana in a work called Nasiketup-akhyana. The Vaishnavite influence in Telugu literature began to be felt during this period.

Thus the translation of the Puranas, kavyas, short stories, mahatmyas and scientific works supplied the topics for the poets of this period. All this seems to have marked the gradual transition of Telugu poetry from the Purana to the classical Prabandha period. The reign of Krishnadeva Raya that followed is also called the golden age of Telugu literature.

Krishnadeva Raya (1505 to 1529) was a patron of art and letters. All the famous artists were employed to decorate his palaces and temples. Though he extended his patronage to writers in all languages, including Sanskrit, he specially favored T. The Augustan age of T literature, which began with the accession of Saluva Narasimha, burst forth in full splendor during his reign. He was himself the author of Amukta-malyada, one of the greatest poems in the language. It deals with the story of Andal and Vishnu Chitta, two of the Alvars, eminent advocates of a true Vaishnava Bhakti cult in South India. It is a storehouse of his personality, scholarship, worldly wisdom, knowledge of political science, religious understanding and Bhakti, or sense of devotion to God. He loved to surround himself with poets and men of letters. His court was adorned by a group of eight eminent T poets called Ashtadiggajas, or the elephants supporting the eight cardinal points of the literary world. He created the concept of a scholar king, one of whose important duties was to protect poets and men of letters and foster the growth of language and literature. It was thereafter recognized by all Telugu kings that one of their principal duties was to patronize T poets and learned men and encourage the growth of literature. As a consequence, notwithstanding the many political changes that followed, T literature flourished owing to the patronage of generations of princes and chiefs who held sway over the land.

After reading about this period I have realized that political stability is very important for the development of art and literature. An enlightened king can play a pivotal role in encouraging the arts.

1526 to 1707

As mentioned above, the reign of Krishnadevaraya, Emperor of Vijayanagara, ushered in a new era of Telugu literature. Earlier, the literature consisted of translation, adaptation and imitation of classical Sanskrit literary models and traditions, particularly of an epic nature. The new age opened up new elements, neo-classical vistas and romantic panoramas. The Prabandha, essentially of the kavya type, now occupied the place of eminence in Telugu literature. Even where the themes of several Prabandhas were borrowed from the treasure house of Sanskrit literature, they had an original flavor in treatment, a dignity in diction and elements of their own in sentiment, description and ornamentation. The composers of the major Prabandhas were poets of high caliber who made use of their background of Sanskrit lore and followed the traits of tradition to a certain extent, not because of any inability to innovate but because of the attitude prevalent among scholars and people at that time.

The court of Krishnadevaraya was named Bhuvana Vijaya (the victory of the worlds). Every year Vasantotsavas, i.e. spring festivals, used to take place at Vijayanagara, and poets were felicitated there. One of his court poets, Allasani Peddana, the best of the lot and the author of Manu Charitra, was honored with the title ‘Andhra kavita pitamaha’, i.e. the grandsire of Telugu poetry. His masterpiece has as its theme the story of Svarochisha Manu, drawn from the Sanskrit Markandeya Purana. Its characters Pravara and Varudhini assumed a new, lively dimension and almost became proverbial with the Telugu public. They are quoted as classical examples of austere chastity and erotic fillip in contexts of social episodes of love and romance, and the work became a model for the following generations of Prabandhas.

Nandi Timmana, another great poet of the same court, composed a beautiful poem, Parijatapaharana Prabandha, based on the romantic episode of the parijata won for Satyabhama in the Harivamsa. Scores of authors drew inspiration from him and composed similar works. The melody of Timmana’s style, together with the texture of Peddana’s diction and the lavish display of phrase of Tenali Ramakrishna, has often been applauded by lovers of Telugu literature.

Besides the Amuktamalyada of Krishnadevaraya mentioned earlier, the other four great poems in Telugu are Manu Charitra, Vasu Charitra, Raghava-Pandaviyamu and Panduranga-mahatmyamu. Together they form the ‘Pancha Maha-Kavyas’, the five great poems in Telugu.

Dhurjati, though a staunch Saivite poet, was much respected by the Vaishnavite ruler Krishnadevaraya. He composed a shrine-epic poem, Kalahastimahatmyamu, and is famous for elegance of characterization and unrivalled sweetness of expression. He always spoke in poetry with heart and soul. The poet Tenali Ramalinga was very popular in the Telugu country, more for the many anecdotes about him than for his poetry.

Pingali Surana was another important figure. In his Raghava-pandaviyamu he evolved a new poetic type in Telugu, the Dvyarthi, i.e. double entendre, in which the stories of the Ramayana and the Mahabharata run parallel in a single expression. This style is an acrobatic feat in poetic form in which the pun strikes the keynote. His Kalapurnodayamu (meaning the birth of one Kalapurna by name) is another great work of all time. A threefold allegory and a comedy of errors are there in its theme. This was a new experiment in Telugu poetry in all its aspects. It seems that only during the 16th century did Telugu poets become conversant with the theories of the various important aesthetic schools in Sanskrit – the schools of Rasa, Riti, Dhvani, Auchitya, Vakrokti and Chamatkara. But none of them except Surana had the will to tread the realms of all these Sahitya Prasthanas in one stroll. His Prabhavati-pradyumnamu is another unique poem in Telugu. Surana, in the opinion of Dr C R Reddy, was an unknown rival to Shakespeare in the East at that time. Thanks to our education system we are more familiar with Shakespeare than with Surana.

Bhattu Murthi lived in the middle of the 16th century. His poetic genius and craft were outstanding, and he was well versed in music too. His Kavyalankara Sangraha is a standard work on poetics in Telugu. He was a great architect of verse, and his Vasucharitra was translated into Sanskrit and Tamil.
The rulers of Golconda, particularly Malik Ibrahim, extended patronage to Telugu letters for a time during the latter half of the 16th century. One of the books produced was Yayati-charitra, a poem composed in pure Telugu devoid of any Sanskrit vocabulary, the first of its kind, and it was followed by a host of such works.

Summing up the characteristics of this age, it may be said that the new spirit of a pompous imperial age led to neo-classical innovations and romantic enterprises. The exuberance of scholarship and the enthusiasm of the poets were channeled into various new types like the Prabandha and the Dvyarthi; into new features like Slesha, Chitra and Bandha; into various Pauranic themes, with a few exceptions, blended with new aesthetic values and various major sentiments, the predominant being Sringara; and into a variety of descriptions charged with flights of imagination and, above all, a grandiloquent diction. This was the age of aesthetic considerations in the history of Telugu literature.

After the fall of the Vijayanagara empire in 1565 we witness an age of decadence in Telugu literature in the Telugu country, from the early 17th century onwards almost up to the dawn of the modern age. But a few poets worth the name flourished during the period. Four poets of the Nellore friends’ circle flourished in the early 17th century: Kankanti Paparaju composed Uttara Ramayanamu; Tekumalla Sayi composed Vanivilasa Vanamalika, a miniature cyclopedia, the first of its kind in Telugu; Pushpagiri Timmakavi composed Samira Kumara Vijayamu; and Ramamantri composed Dasavatara Charitra. All these works are original compositions and not translations.

Never before or after in the history of Telugu literature were there so many royal poets as in this age: Malli Ananta, Kumarananta and Damera Ankabhupala, to name a few. History for the first time formed the theme of two narrative poems during this period. Ramarajiyamu deals with the story of Aliya Ramaraya, the great regent and for some time ruler of Vijayanagara. The second, Krishnaraya Vijayamu, deals with the story of Krishnadevaraya of Vijayanagara. Appakavi, a great grammarian, wrote in verse a prolific commentary, Appakaviyamu by name, on the Andhra-sabdachintamani, the first treatise on Telugu grammar, written in Sanskrit. There were other, less famous writers on Telugu grammar too.

There is another phase in the history of Telugu literature during the 17th century: literature produced outside the Telugu country, namely in the courts of Tanjore and Madura. The court of the Tanjore ruler Raghunatha Nayaka (1600 to 1632) was described by Rama-bhadramba, his court poetess, in her Raghunathabhyudaya. The king himself was accomplished in Telugu and Sanskrit and composed a number of Telugu poems. The king’s children, along with the poet laureate Rangajamma, composed Yakshaganas. The Yakshagana assumed the full stature of a regular play, removed the dearth of dramatic literature in Telugu, and had its heyday not only then but even in the next 150 years, when the Marathas ruled over Tanjore. The court of Madura too produced a number of Telugu works. You might wonder how Telugu developed in Madura, which lies in Tamil Nadu; note that states like Tamil Nadu and Andhra Pradesh were formed only after independence.

Overall this period saw the blossoming of Telugu literature.

1707 to 1815

The changing political and cultural environment of the 18th century had its impact on the growth of Telugu literature. The rise of foreign pockets and petty principalities brought about a radical change in the character and quality of the literary works of the period. Earlier, poetry had been the monopoly of the Niyogi Brahmins alone. With the change, Vaidika Brahmins and many non-Brahmins began to compose, writing treatises on grammar and commentaries on the classics besides imitations of the great Prabandhas of the past – all of which indicates that the creativity of the authors was at a low ebb. The dialects and slang of various parts of the country and foreign vocabulary found a place in the compositions of the times. The religious unrest of the day and society was reflected in literature too. As early as 1712 Christian missionaries introduced the printing press, along with things like coffee and tobacco. It was a period of transition in which adherence to tradition was being overpowered by an urge to change.

Kasturi Rangakavi’s Sambanighantuvu is a lexicon of pure native Telugu. Timmakavi of the famous Kuchimanchi family in the East Godavari district was a prolific author and a master of pure Telugu compositions. Glimpses of contemporary society are discernible in his works Bharga Shataka and Kukkuteshwara Shataka; the plight of the common peasant is portrayed in the Bharga Shataka.

Adidam Surakavi, son of Bala Bhaskar, was a famous poet of those times and was much feared for his biting tongue. Among his works are Kavi Samshaya Viccheda, a treatise on some special aspects of Telugu grammar, and Andhranamaseshamu, a small dictionary of pure Telugu vocabulary in verse form.

Gogulapati Kurmanath Kavi was undoubtedly a great poet of the century. His Mrityunjaya Vilasamu is in a class by itself in Yakshagana literature. His Simhadri Narasimha Shataka is historical in its appeal: it gives us a vivid picture of the unhappy results of the Muslim inroads into our country and the destruction of temples. Mangalagiri Kavi, a Brahmin poet of the 18th century, gave an effective portrait of Jesus Christ in his Vedanta Rasayanamu.

Telangana consists of nine districts in Andhra Pradesh which were earlier part of the Nizam’s domain. Thanks to a number of small principalities, Telugu literature flourished there during this period. Independent of any patronage, Lingamurthy Parashuram Panthulu, belonging to Maratha stock, composed a great work called Sitaramanjaneya Samvadamu, wherein good poetry is coupled with a lucid exposition of the Advaita philosophy. It became a handbook of every teacher and preacher of philosophy in the Telugu country. The Maringati family of Nalgonda district too is reputed for its generations of scholars. Kiriti Venkatacharya, a distinguished scholar-poet of this family, composed 13 works. His Achalatmaja Parinayamu is in double entendre, wherein the marriage stories of Sita and Parvati are woven into one.

Shahji Bhonsle (1684 to 1712), the eldest son of Ekoji I, the founder of Maratha rule in Tanjore and the stepbrother of Shivaji, composed 22 plays in Telugu. Tukoji, brother and successor of Shahji, wrote in Telugu too. He conferred the title of ‘Andhra Kalidasa’ on Aluru Kuppana, the author of Parthasaratha Vijaya. Shri Narayana Tirtha, author of the famous Sri Krishna Leelatarangini in Sanskrit, composed the Parijatapaharanam Yakshagana in Telugu. The great celebrity in the world of music, Tyagaraja, was just coming into the limelight through his unique skill in the art of music and composition.

The Madura court witnessed its golden age during the reign of V Chokkanath (1706 to 1738). Its speciality lies in its crop of prose literature and erotic poetry. Telugu literature received some patronage from the Maharaja of Mysore too.

To conclude, there was indeed a rich crop of literary works during this period, but not of the highest quality. Some of them were translations from Sanskrit and others imitations of Telugu classics. A variety of themes is also seen in the literature of the time: classical themes like the epics were respected, but contemporary and foreign themes were accepted as well. The general trend of the literature shows a slackening of the traditional texture, decorum and soundness, and a considerable craze for experiment.

1818 to 1905

As said earlier, the 18th century witnessed a considerable decline in Telugu literature. The major part of the literature published during the first half of the 19th century was poetry, particularly the Satakas. These were large in number and showed a greater amount of originality, moral instruction, social element, human touch and spirit of lyricism than other poetic forms. Phakki Venkata Narasayya and Vasurayalu were some of the eminent Sataka writers. There also flourished a host of poetesses, the most famous of whom was Vengamma.

The founder of modern Telugu literature was Rao Bahadur K. Viresalingam. He was influenced by English literature, and the theme of his first novel, Rajasekhara-charitramu, was suggested by Goldsmith’s Vicar of Wakefield. Later Viresalingam became a member of the Brahmo Samaj and used his pen to advocate social reform and advanced ideas in all spheres of life. His Andhra Kavula-charitra was the first attempt to write a history of Telugu literature.

The drama of the modern type evolved from older forms of play-writing like the Vidhinataka and the Yakshagana. The age-old Yakshagana had almost developed into a full-blown type of native drama by the time it made its debut in the 19th century. The great Tyagaraja and Shahji were some of the important authors of that time. In the latter part of the 19th century the Dharwar Dramatic Company came on the scene. Telugu writers of the time were so fascinated by its performances that they took to writing plays on modern lines. Some made translations from Sanskrit or English, while others took up new themes but adopted Sanskrit or English norms in the technique of composition.
Viresalingam did pioneering work in this direction. As the first novelist, he is hailed as the father of modern prose literature in Telugu. Gurajada Apparao may be hailed as the father of the modern short story in Telugu.

During the second half of the 19th century appeared a number of prose works dealing with moral fables, stories of pilgrimages, and subjects like politics, law and so on. Chinnayasuri’s Nitichandrika won a reputation as a classical specimen of modern Telugu prose in a very chaste and lucid style. Some writers employed a colloquial style and satisfied the growing needs of the vast majority of the public. As prose gained momentum, the novel, the short story, the essay and the like – forms whose essential medium is prose – came into being.

Eminent services were rendered by Europeans to the cause of Telugu literature during the first half of the 19th century. W. Carey and others wrote Telugu grammars and published dictionaries, while Col. Mackenzie and others took pains to collect manuscripts of old works.

1905 to 1947

There was all-round progress in the different branches of Telugu literature during this period. In particular, the two decades from 1915 to 1935 have been regarded by some as the most brilliant period in Andhra literary history. An important reason was the almost revolutionary change brought about by G. V. Ramamurti Pantulu (died 1940), who emancipated Telugu from archaic grammar and introduced the spoken language as the vehicle of literature. He was also a pioneer in the field of journalism in Telugu, and through his journal Viveka-vardhini (1874) he propagated modern ideas, fostered fresh creative art in literature and launched attacks against social evils and superstition. His translation of Kalidasa’s Abhijnana-Sakuntala (1883) is still considered the best.

Lyrical poetry reached a high degree of excellence, its main themes being love in its various forms and appreciation of the beauties of nature. Nanduri Venkata Subbarao wrote exquisite love lyrics in the series Yenki-Patalu, or the songs of Yenki, regarded by some as the most beautiful love poems in modern Indian literature. Two other great poets were Visvanatha Satyanarayana, who was also called Kavi-samrat, and Devulapalli Krishnasastri, who came to be known as the ‘Shelley of Andhra’. The latter, like many others, belonged to the old school. Srirangam Srinivasa Rao belonged to the progressive school, while neo-classicism was represented by poets such as G. Joshua.

The poet Visvanatha Satyanarayana also wrote novels. His best-known work is Veyipadagalu (a thousand snake hoods), which gives a comprehensive picture of present-day Andhra society. Telugu seems to be richer in short stories than in long novels.

Fiction in Telugu had its origin in the 17th century, but it was then in the form of a narration of a Puranic story or a fairy tale, with little artistic merit. It was not until the 1870s that novels in the modern sense came to be written in Telugu. Early Telugu novels were translations of English or Bengali ones.

Dramatic literature also made good progress. Gurajada Apparao is the author of the first noteworthy social drama, Kanya-sulkamu (bride price). There were other authors of social and historical plays, like K. S. Rao, who wrote on the fall of Vijayanagara.

The period showed remarkable progress in essays, particularly in works of literary criticism. Although the honor of being the first essayist in Telugu goes to S. N. N. Naidu for his Hitasuci (1862), a collection of eight essays, the essay in the modern sense started with Viresalingam.

A very important role in the development of Telugu literature was played by the Sahiti Samiti – a sort of literary fellowship – founded by Sivashankara Sastri, the ‘Anna Guru’, who attracted around him a number of brilliant writers, poets, short-story writers and essayists. The movement in favor of adopting the spoken language as a literary medium was inaugurated by G. V. R. Pantulu, but its success was assured by the practical adoption of this medium by this group of writers. It is not unlikely that they were inspired by the Sabuj Patra movement in Bengali literature.

The autobiography of T. Prakasam is an outstanding work in Telugu literature. The Swadeshi movement in Bengal in 1905 also had a great repercussion on Telugu literature. Several novels written in Hindi and European languages were also translated into Telugu.

Marxist ideas also had their impact on modern Telugu poetry, though they could make that impact only after Independence. These writers have a new attitude to poetry, noticeable both in the form and in the content of their works. S. S. Rao ranks high among these poets.

Friends, as a North Indian I am truly impressed with the language and literary development of Telugu. I am sure that as I read about more Indian languages I will find similar developments there too. Thanks to a convent education that stresses the importance of knowing English language and literature over Indian writings, a number of us, including myself, have stopped reading in Indian languages. I am convinced that all of us who do so are missing out on some great literary works.

By compiling this article I have found an answer to my question: the commonality and relationship between Telugu and Sanskrit.

Long Live Kshatriya Dharam