When it comes to web programming, the two front runners are easily PHP and ASP.NET.


ASP.NET:
If you program in ASP.NET you'll typically get two responses from the other side: either you're rich (or your company is), or you're a Microsoft lover. While the name comes from Microsoft's old ASP technology, the .NET Framework was a huge leap forward, and the CLR allows you to use other languages for back-end processing: typically Visual Basic.NET or C#.

ASP.NET's strength lies in its object-oriented features and its flexibility. Because of the CLR you can have C# programmers and VB.NET programmers working on the same project, or switch languages halfway through without having to rewrite all of your old classes. The .NET class library is organized into inheritable classes based around particular tasks, such as working with XML or image manipulation, so many of the more common tasks have already been handled for you.

Visual Studio .NET is a massive development IDE that will shave tons of time off your coding. It has built-in debugging along with IntelliSense, which auto-completes methods and variables so you don't have to memorize everything.

On the downside, ASP.NET is expensive. For one, it uses far more resources on the web server, so you'll need either a beefier server or more servers in the farm. It's also extremely rare for an ASP.NET app not to be running on IIS, and IIS brings its own downsides, as it has historically been bug-prone.

PHP:
PHP works in combination with HTML to display dynamic elements on the page. PHP only parses code within its delimiters, <?php and ?>. Anything outside its delimiters is sent directly to the output and not parsed by PHP.
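A minimal sketch of this parsing model (the file layout and variable name are just for illustration):

```php
<p>Static HTML: sent straight to the output, untouched by PHP.</p>
<?php
// Only this part, between the <?php delimiter and the end of the file,
// is parsed and executed by the PHP engine.
$visitor = "World";              // hypothetical variable for illustration
echo "<p>Hello, $visitor!</p>";  // dynamic HTML produced by PHP code
```

Saved as, say, hello.php and served through a PHP-enabled web server (or run with php hello.php on the command line), both lines reach the browser: the first passed through verbatim, the second generated at request time.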

PHP's strength lies mostly in LAMP. The LAMP architecture has become popular in the web industry as a way of deploying inexpensive, reliable, scalable, secure web applications. PHP is commonly used as the P in this bundle alongside Linux, Apache and MySQL. PHP can be used with a large number of relational database management systems, runs on all of the most popular web servers, and is available for many different operating systems. This flexibility means that PHP has a wide installation base across the Internet; over 18 million Internet domains are currently hosted on servers with PHP installed.

PHP 5 finally brought exception handling and true OOP, but it still lacks namespaces to prevent class-naming collisions. PHP's type checking is also very loose: variables are not really considered to have a type, which can cause subtle problems.
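A short sketch of the kind of type juggling this refers to (values chosen purely for illustration; behavior shown is the PHP 5-era default):

```php
<?php
// A variable's "type" is just whatever its current value happens to be.
$x = "5";            // $x holds the string "5"
$x = $x + 1;         // arithmetic silently coerces it: $x is now the integer 6
var_dump($x === 6);  // bool(true)

// Loose comparison (==) coerces both sides before comparing...
var_dump("1" == "01");   // bool(true) - both treated as the number 1
// ...while strict comparison (===) also checks the type.
var_dump(1 === "1");     // bool(false)
```

Sticking to === wherever possible is the usual defense against this class of bug.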



With CMSes like Drupal now at the heart of website building, PHP has become extremely useful; the entire Drupal package is coded in it. Corporations have to wake up and choose what they actually want instead of going with the blunt saying:


"It's worth it only if you buy it!"

Wednesday, November 4, 2009

ADAM - The Robot Scientist!


Scientists at Aberystwyth University and the University of Cambridge in the UK have managed to create the world's first robot that can carry out its own experiments, produce hypotheses and make scientific discoveries. The researchers dubbed their invention Adam.
Working on its own, the robot scientist has already managed to find new functions for several genes of Saccharomyces cerevisiae, also known as brewer's yeast.
The lead researcher on the project is Ross King, a computational biologist at Aberystwyth. He says that so far Adam has made modest findings, but all the discoveries were real. The invention consists of a room equipped with different laboratory instruments. It includes four personal computers that work as one brain. In addition, Adam has robot arms, a number of cameras, liquid handlers, incubators and more.
Performing the Experiment
Scientists gave their latest invention a freezer with a collection of mutant strains of yeast where individual genes were deleted. The robot was also provided with a database that contained information on the yeast genes, enzymes and metabolism, as well as a supply of hundreds of metabolites.
In order to find which genes coded for which enzymes, the robot cultured mutant yeast that had a specific gene deleted, then analyzed how the mutant grew without a certain metabolite. If a strain was seen to grow poorly, Adam recorded new information about the function of the deleted gene.
It is worth mentioning that Adam is able to perform over 1,000 similar experiments daily. So far, the robot has come up with and tested 20 hypotheses about the genes coding for 13 enzymes, of which 12 were confirmed by researchers who carried out their own experiments.




The memristor is a microscopic component that can "remember" electrical states even when turned off. It's expected to be far cheaper and faster than flash storage. A theoretical concept since 1971, it has now been built in labs and is already starting to revolutionize everything we know about computing, possibly making flash memory, RAM, and even hard drives obsolete within a decade.

The memristor is just one of the incredible technological advances sending shock waves through the world of computing. Other innovations in the works are more down-to-earth, but they also carry watershed significance. From the technologies that finally make paperless offices a reality to those that deliver wireless power, these advances should make your humble PC a far different beast come the turn of the decade.




Googlebots (also known as robots, bots or spiders) are a set of programs that fetch billions of web pages. Bots are software built to go through every page of a website, categorize those pages and place them in the Google database. They have specific algorithms which determine how many pages to fetch from a website.

Google has three well known bots:

  • The Adsense bot
  • The Freshbot
  • The DeepCrawl bot
The Adsense bot is generally used for publishers who have Adsense on their websites. Whenever a new page is created, the JavaScript within the Adsense code sends a message to the Adsense bot, and the page is reviewed by the bot within 15 minutes.

The Freshbot crawls only the most popular pages on a website. There may be one such page or thousands. Generally the Freshbot visits a typical website every 1 to 15 days, depending on the popularity of the website. Some very popular websites like Amazon.com receive Freshbot crawls every 10 minutes, thanks to regular updating and frequent changes on those sites. The Freshbot finds all the deeper links in your website and collects those links in the database.

The DeepCrawl bot visits a website once per month and crawls all the links referenced by the Freshbot. Since the deep crawl occurs only once a month, it can take up to a month for your entire site to be indexed in Google. Even if you submit a Google sitemap for your website, you have to wait for a deep crawl to occur.

Remember:
Google likes fresh content, and if you can get genuine, valuable inbound links for your website, Google will definitely fall in love with it. Check out the complete Google webmaster guidelines to generate more traffic for your website and to rank higher in Google with a better PageRank.

Most of us are crazy about ranking our websites on Google but are not aware of Google's methodology for indexing a website. Google is like a large book with an organized index, ready to locate whatever we want. So let's first see how the great Google indexes a particular website, how it finds web pages matching a visitor's query, and how it determines the order of the results it displays.

There are three major pillars Google relies on while indexing web pages:

Bots and Crawlers: A huge set of programs used to fetch or crawl billions of web pages, automatically following all of the links on each web page.

Indexers: Programs that analyze the web pages downloaded by the bots and crawlers.

Servers: For handling the interaction between the user and the search engine.