Saturday, November 29, 2008

Logging In to a Remote Computer in Ubuntu

The most basic type of ssh connection is a remote login. This will give you a command prompt on the remote computer, as if you had just sat down in front of it and opened GNOME Terminal. But before you can log in to any machine via ssh, you’ll need to be sure the remote computer is able to accept ssh connections. This means that it needs to be running the ssh server program (referred to as a service), and also that its firewall has an open port for incoming connections.

The two major components of OpenSSH are the client and server. Most distributions install both items and run the server component all the time. However, only the client component of SSH is installed under Ubuntu. To install the server component, and therefore access your Ubuntu system remotely, you’ll need to open Synaptic Package Manager (System -> Administration) and search for openssh-server. Click to install it. Configuration will be automatic, although if you’re using the Ubuntu firewall, you will need to configure an incoming rule to open port 22.
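
If you prefer the command line, the same setup can be done with apt-get and, if you use the Uncomplicated Firewall (ufw), a single rule for port 22; a quick sketch:

sudo apt-get install openssh-server
sudo ufw allow 22/tcp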

Initiating an ssh remote shell session with a remote machine is usually achieved by typing something similar to the following at a command prompt on the local machine:

ssh <username>@<IP address>

In other words, you specify the username you want to log in as, as well as the IP address of the machine. If there’s a fully qualified domain name (FQDN) for the system you want to access, you could specify that instead of the IP address.
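
For example, to log in as a user called sarah (a name used here purely for illustration) on a machine at IP address 192.168.1.10, you would type:

ssh sarah@192.168.1.10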

You’ll be prompted for your password which, obviously, is the password for the account you’re trying to log in to on the remote computer.

After confirming that you want to make the connection, you’ll be invited to enter the password for the user account under which you initiated the ssh connection. Once this is done, you should find yourself with a shell login on the remote computer. You can run the same commands as usual and perform identical tasks.

The machine you’re logged in to will show no symptoms of being used remotely. This isn’t like the movies, where what you type on your local machine is somehow mirrored on the remote machine for all to see. However, obviously, if a user of the remote machine were to view her network connections using something similar to the netstat command, then she would see another computer attached via ssh. To end an ssh session, simply type exit. This will then return you to the command prompt on your own machine.


Managing Remote Sessions
Whenever you open any kind of shell to enter commands and run programs, you might have noticed that any commands you start running last only as long as the shell window is open. When the shell window is closed, any task running within it ends, too. This is because the shell is seen as the “owner” of the process, and when the owner dies, any processes it started also die. When using ssh to start a remote shell session, this also applies. Whenever you log out, any tasks you were running are ended. This can be annoying if, for example, you’ve started a lengthy download on the remote machine. Effectively, you must remain logged in via ssh until the download has finished.

To get around this, you can use the handy screen program. This isn’t specifically designed to be an aid to remote logins, but there’s no reason why it cannot be used in such a situation. The screen program effectively starts shell sessions that stick around, even if the shell window is closed or the ssh connection is ended or lost. After logging in to the remote computer via ssh, you can start a screen session by simply typing the program name at the prompt:

screen

After pressing the spacebar as prompted to start the program, there won’t be any indication that you’re running a screen session. There’s no taskbar at the bottom of the terminal window, for example. Screen works completely in the background. Let’s consider what happens when you detach and then reattach to a screen session. To detach from the screen session, press Ctrl+A and then Ctrl+D. You’ll then be returned to the standard shell and, in fact, you could now disconnect from your ssh session as usual. However, the screen session will still be running in the background on the remote computer. To prove this, you could log back in, and then type this:

screen -r

This will resume your screen session, and you should be able to pick up quite literally where you left off; any output from previous commands will be displayed. To quit a screen session, you can either type exit from within it or press Ctrl+A, and then Ctrl+\ (backslash). The screen program is very powerful. To learn more about it, read its man page. To see a list of its keyboard commands, press Ctrl+A, and then type a question mark (?) while screen is running.
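
If you tend to run more than one screen session on the same machine, it helps to give each one a name; the session name used here is just an example. The -S option starts a named session, -ls lists the sessions currently running, and -r reattaches to the one you name:

screen -S bigdownload
screen -ls
screen -r bigdownload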

Source of Information : Beginning Ubuntu Linux - From Novice To Professional

Wednesday, November 26, 2008

Freeing Up Disk Space in Ubuntu

After using Ubuntu for some time, you might find that the disk begins to get full. You can keep an eye on disk usage by using the following command in a terminal window:

df -h

This will show, for each mounted partition, the total size, the space used, and the free space in human-readable units (megabytes or gigabytes), along with the percentage used. If the disk does start to get full, you can take steps to make more space available.
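
If you want to see which directories are eating the space, the du command can help. The paths below are only examples; point it at whatever locations you suspect:

sudo du -sh /home/* /var/cache/apt/archives /var/log /tmp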


Emptying the /tmp Folder
An easy way to regain disk space is to empty the /tmp folder. Like its counterpart in the Windows operating system, this is the folder in which temporary data is stored. Some applications clean up after themselves, but others don’t, leaving behind many megabytes of detritus.

Because the /tmp folder is accessed practically every second the system is up and running, to empty it safely, it’s necessary to switch to run level 1. This ensures few other programs are running, and avoids the risk of deleting data that is in use. First, switch to the text console by pressing Ctrl+Alt+F1. Then enter these commands to switch to run level 1:

sudo killall gdm
sudo init 1

A recovery menu will appear. Select the Drop to Root Shell Prompt option. Then enter the following to empty the /tmp folder and reboot:

rm -rf /tmp/*
reboot

On a similar theme, don’t forget to empty the desktop Trash. This can hold many megabytes of old data. If you see an error message about permissions when emptying the Trash, you can do so manually from a terminal window. Simply type sudo rm -rf ~/.Trash/* to get the job done.


Emptying the Cache of Package Files
You might also choose to clear out the Advanced Packaging Tool (APT) cache of old .deb package files. On a system that has been very frequently updated, this can free many megabytes (possibly gigabytes) of space. You can empty the cache by typing the following command in a terminal window:

sudo rm -f /var/cache/apt/archives/*.deb

However, the apt-get clean and apt-get autoclean commands are considered a much safer way to remove unwanted package files.
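
For reference, the safer commands look like this. The autoclean variant removes only package files that can no longer be downloaded from the repositories, while clean empties the cache completely:

sudo apt-get autoclean
sudo apt-get clean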

If you want to restore any packages later on, simply locate them in the Synaptic Package Manager list, click the check box, and click Mark for Reinstallation. This will cause the package to be downloaded, installed, and configured.

Be careful to type the command to empty the APT cache exactly as it’s written. Even inserting an additional space can lead to very bad consequences!


Removing Unused Software
If you still need disk space, consider uninstalling unused programs. As you’ve learned, you can manage software through the Synaptic Package Manager (System -> Administration -> Synaptic Package Manager).

To remove a package, click its check box and select Mark for Removal. However, it’s not a good idea to simply scroll down the list and remove anything that seems dispensable.
Because of the way Linux works, many seemingly insignificant packages are actually vital to the running of the system. Instead, it’s a better idea to look for programs on the Applications menu, and then return to the Synaptic Package Manager to remove them by searching for their names.

As always, removing software can create dependency problems, so you might find yourself limited in what software you can actually remove.

If you want to remove all the desktop games, simply search for gnome-games and gnome-games-data in the Synaptic Package Manager, and mark them for removal.
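
If you prefer to work at the command line, the equivalent is something like the following:

sudo apt-get remove gnome-games gnome-games-data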

Source of Information : Apress Beginning Ubuntu Linux

Monday, November 24, 2008

Ubuntu Prelinking

A lot of Ubuntu software relies on other pieces of code to work. These are sometimes referred to as libraries, which is a good indicator of their purpose: to provide functions that programs can check in and out whenever they need them, as if they were borrowing books from a library.

Whenever a program starts, it must look for these other libraries and load them into memory, so they’re ready for use. This can take some time, particularly with larger and more complicated programs. Because of this, the concept of prelinking was invented. By a series of complicated tricks, the prelink program makes each bit of software you might run aware of the libraries it needs, so that memory can be better allocated.

Prelinking claims to boost program startup times by up to 50% or more, but the problem is that it’s a hack—a programming trick designed to make your system work in a nonstandard way. Because of this, some programs are incompatible with prelinking. In fact, some might simply refuse to work unless prelinking is deactivated. At the time of this writing, such programs are in the minority. However, keep in mind that prelinking can be easily reversed if necessary. Alternatively, you might want to weigh whether it’s actually worth setting up prelinking in the first place.


Configuring Prelinking
If you decide to go ahead with prelinking, you’ll need to download the relevant software from the Ubuntu software repositories. Open the Synaptic Package Manager (System -> Administration -> Synaptic Package Manager), click the Search button, and type prelink into the search box. Mark prelink for installation, and then click Apply.

Before you can run a prelinking sweep of your system, you need to enable it in one of its configuration files. To do this, type the following in a terminal window:

gksu gedit /etc/default/prelink

Change the line that reads PRELINKING=unknown to PRELINKING=yes. Then save the file and quit Gedit.

To run a prelinking scan of your system, simply issue this command:

sudo prelink -a

This will prelink practically all the binary files on your system, and may take some time to complete. You may also see some error output, but you don’t need to pay attention to it. Prelinking was automatically added as a daily cron job when you installed it, so any new programs you add will be automatically prelinked.
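
If you want to confirm that the daily job is in place, it is normally installed as a script under /etc/cron.daily, although the exact path may vary between releases:

ls -l /etc/cron.daily/prelink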


Deactivating Prelinking
Should you find prelinking makes a particular application malfunction or simply stop working, you can try undoing prelinking. To do this, find out where the main binary for the program resides, and issue the prelink command with the --undo command option. For example, to remove prelinking from the Gedit text editor program, you could type the following:

whereis gedit

This command will show that the gedit binary is found at the location /usr/bin/gedit in the file system. Next, attempt to undo prelinking on the binary:

sudo prelink --undo /usr/bin/gedit

However, this may not work, because some programs might rely on additional binaries on the system. Therefore, the solution might be to undo prelinking for the entire system, which you can do by typing the following:

sudo prelink -ua

After this, you should remove the prelink package, via the Synaptic Package Manager, to stop it from running again in the future (or manually remove its cron entry).

Source of Information : Apress Beginning Ubuntu Linux 3rd Edition

Wednesday, November 19, 2008

Automating Ubuntu Installation with Kickstart

Kickstart is a method for providing a predetermined installation configuration for installing Ubuntu. Instead of having a user enter responses on the install screens, the responses can be listed in a kickstart file from which the install process can read. You will need to create a kickstart configuration file on a working Ubuntu system. (Kickstart configuration files have the extension .cfg.) A kickstart file holding the install responses used for that installation is created for every Ubuntu system; it is located in the root user’s home directory, at /root/anaconda-ks.cfg.

If you plan to perform the same kind of installation on computers that would be configured in the same way—such as on a local network with hosts that have the same hardware—you could use this kickstart file as a model for performing installations. It is a text file that you can edit, with entries for each install response, such as the following for keyboard and time zone:

keyboard us
timezone America/Los_Angeles

More complex responses may take options, such as network, which uses --device for the network interface and --bootproto for the boot protocol (how the interface obtains its address):

network --device eth0 --bootproto dhcp

Display configuration is more complex, specifying a video card and monitor type, which could vary. You can have the system skip this by using skipx. The first entry in the file is the install source. This will be cdrom for a CD/DVD-ROM install. If you want to use an NFS or Web install instead, you could add that information here, specifying the server name or Web site.
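
Putting these pieces together, the start of a simple kickstart file might look something like the following. The values are purely illustrative, and the exact set of directives you need depends on your installer version:

install
cdrom
lang en_US.UTF-8
keyboard us
timezone America/Los_Angeles
network --device eth0 --bootproto dhcp
skipx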

You can also use the system-config-kickstart tool to create your kickstart file. This provides a graphical interface for each install screen. First install the tool. Then, to start it, choose Applications | System Tools | Kickstart. The help manual provides a detailed description on how to use this tool.

The name of the configuration file should be ks.cfg. Once you have created the kickstart file, you can copy it to CD/DVD or even to a floppy disk. You could also place the file on a local hard disk partition (such as a Windows or Linux partition) if you have one. For a network, you could place the file on an NFS server, provided your network is running a DHCP server to enable automatic network configuration on the install computer. When you start the installation, at the boot prompt you specify the kickstart file and its location. In the following example, the kickstart file is located on a floppy disk as /dev/fd0:

linux ks=floppy

You can use ks=hd:<device>:/<file> to specify a file on a particular device, such as a hard drive partition or a second CD-ROM drive. For an NFS server, you would use ks=nfs:<server>:/<path>.

Source of Information : McGraw Hill Ubuntu The Complete Reference

Monday, November 17, 2008

Web Servers and Apache

The flip side of a Web browser is the Web server, the application that actually locates and delivers content from a specified URI to the browser. What does a Web server have to do? At the most basic level, it simply has to deliver HTML and other content in response to incoming requests. However, to be useful in a modern Web-oriented environment, a Web server has to do several things. The most important of these are the following:

• Be flexible and configurable, making it easy to add new capabilities and Web sites, and to support increasing demand, without recompilation and/or reinstallation.

• Support authentication to limit users who can access specific pages and Web sites.

• Support applications that dynamically generate Web pages, such as Perl and PHP, to support a customizable and personal user experience.

• Maintain logs that can track requests for various pages so that you can both identify problems and figure out the popularity of various pages and Web sites.

• Support encrypted communications between the browser and server, to guarantee and validate the security of those communications.

The order of importance of these various requirements depends on whether you are a systems administrator or e-commerce merchant, but all modern Web servers must provide at least these capabilities. Many different Web servers are available today, depending on your hardware platform, the software requirements of third-party software that a Web site depends on, your fealty to a particular operating system vendor, and whether or not you are willing to run open source software, get additional power, and save money.

As you might expect, the first Web server in the world went online at CERN, along with the first Web browser. These were written and ran on NeXT workstations, not exactly the world’s most popular platform (sadly enough). The first test of a Web server outside of Europe was made using a server running at the Stanford Linear Accelerator Center (SLAC) in the United States.

The development focus of Web servers that ran on more popular machines was initially the NCSA (National Center for Supercomputing Applications) Web server, known as NCSA httpd (HTTP Daemon). Their development of a freely available Web server paralleled their development of the NCSA browser, known as Mosaic. When one of the primary developers of NCSA httpd (Rob McCool) left the NCSA, a group of NCSA httpd fans, maintainers, and developers formed to maintain and support a set of patches for NCSA httpd. This patched server eventually came to be known as the Apache Web server. Though the official Apache Web site used to claim that the name “Apache” was chosen because of their respect for the endurance and fighting skills of the Apache Indians, most people think that this was a joke, and that the name was chosen because the Web server initially consisted of many patches; in other words, it was “a patchy Web server.”

Two Apache servers are available, contained in the packages apache and apache2. The primary differences between these two versions of the Apache Web server are their code base, their vintage, and how you install and maintain them. The apache package is the latest and greatest version of the Apache 1.x family of Web servers, which was excellent in its day, is still extremely popular, and is still in use in many Web sites across the Net. However, the apache2 package contains the latest and greatest version of the Apache 2.x Web server, which is essentially “Apache, the Next Generation.” Though things work differently in Apache 2.x, especially from a system administrator’s point of view, Apache 2.x is a far superior Web server and where future Apache extension development is going to take place.
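
On Ubuntu, getting the newer server up and running for a quick test is typically as simple as installing the package and pointing a browser at the machine; a sketch:

sudo apt-get install apache2

Once the package is installed, visiting http://localhost/ in a browser should show the default page, confirming that the server is answering requests.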

Today, Apache Web servers installed at sites across the Internet deliver more Web content than any other Web server. I forget the name of the second most popular Web server, but it only runs on a single operating system (which is not Linux) and therefore loses conceptually as well as numerically.

Source of Information : Ubuntu Linux - Bible

Friday, November 14, 2008

World Wide Web

In 1989, what has become the World Wide Web first entered the world in the mind of Tim Berners-Lee at CERN (Conseil Européen pour la Recherche Nucléaire), the European Laboratory for Particle Physics near Geneva, Switzerland. The term World Wide Web wasn’t actually coined until 1990, when Tim Berners-Lee and Robert Cailliau submitted an official project proposal for developing the World Wide Web. They suggested a new way of sharing information between researchers at CERN who used different types of terminals and workstations. The unique aspect of their information sharing model was that the servers would host information and deliver it to clients in a device-independent form, and it would be the responsibility of each client to display (officially known as render) that information. Web clients and servers would communicate using a language (protocol) known as HTTP, which stands for the HyperText Transfer Protocol.

Hypertext is just text with embedded links to other text in it. The most common examples of hypertext outside of the World Wide Web are various types of online help files, where you navigate from one help topic to another by clicking on keywords or other highlighted text. The most basic form of hypertext used on the Web is HTML, the HyperText Markup Language.

On the World Wide Web, the servers are Web servers and the clients are typically browsers, such as Firefox, Opera, SeaMonkey, Netscape, Microsoft Internet Explorer, Apple’s Safari, and many others, running on your machine. To retrieve a Web page or other Web resource, you enter its address as a Uniform Resource Identifier (URI) in your browser by either typing it in or clicking on a link that contains a reference to that URI. Your browser contacts the appropriate Web server, which uses that URI to locate the resource that you requested and returns that resource as a stream of hypertext information that your browser displays appropriately, and you’re off and running!

Today’s browsers can understand many protocols beyond HTTP, including FTP (File Transfer Protocol, used to send and receive files), file (used to deliver plain-text files), POP (Post Office Protocol, used to send and receive electronic mail), and NNTP (Network News Transfer Protocol, used to send and receive Usenet News postings). Which protocol you use to retrieve a specific Web resource is encoded into the URI, and is referred to as a scheme in Web nerd terms. A URI specifies three basic things:

scheme://host/pathname

The scheme is one of http, ftp, file, and many more, and specifies how to contact the server running on host, which the Web server then uses to determine how to act on your request. The pathname is an optional part of the URI that identifies a location used by the server to locate or generate information to return to you.

Web pages consist of a static or dynamically generated text document that can contain text, links to other Web pages or sites, embedded graphics in a variety of formats, references to included documents such as style sheets, and much more. These text documents are created using a structured markup language called HTML, the HyperText Markup Language. A structured markup language is a markup language that enforces a certain hierarchy where different elements of the document can appear only in certain contexts. Using a structured markup language can be useful to guarantee that, for example, a heading can never appear in the middle of a paragraph. Like documents in other modern markup languages, HTML documents consist of logical elements that identify the type of each element; it is the browser’s responsibility to identify each element and determine how to display (render) it. Using a device-independent markup language simplifies developing tools that render Web pages in different ways, convert the information in Web pages to other structured formats (and vice versa), and so on.

A note on “Web addresses”: URL (Uniform Resource Locator) is the traditional acronym and term for a Web address, but the acronym URI (Uniform Resource Identifier) is actually more technically correct. Another acronym you may come across is URN (Universal Resource Name). The relationship between these acronyms is the following: a URI is any way to identify a Web resource. A URL is a URI that explicitly provides the location of a resource and the protocol used to retrieve it. A URN is a URI that simply provides the name of a resource, and may or may not tell you how to retrieve it or where it is located.

Source of Information : Ubuntu Linux - Bible

Thursday, November 13, 2008

Ubuntu Help and Documentation

A great deal of help and documentation is available online for Ubuntu, ranging from detailed install procedures to beginners’ questions. The two major sites for documentation are https://help.ubuntu.com and the Ubuntu forums at www.ubuntuforums.org. In addition, you can consult many blog and news sites as well as the standard Linux documentation. Also helpful is the Ubuntu Guide Wiki at http://ubuntuguide.org. Links to Ubuntu documentation, support, blogs, and news are listed at www.ubuntu.com/community. Here you will also find links for the Ubuntu community structure, including the code of conduct. A “Contribute” section links to sites where you can make contributions in development, artwork, documentation, and support. For mailing lists, check http://lists.ubuntu.com. You’ll find lists for categories such as Ubuntu announcements, community support for specific editions, and development for areas such as the desktop, servers, or mobile implementation.


help.ubuntu.com
Ubuntu-specific documentation is available at help.ubuntu.com. Here on tabbed pages you can find specific documentation for different releases. Always check the release help page first for documentation, though it may be sparse and cover mainly changed areas. The Ubuntu LTS release usually includes desktop, installation, and server guides. The guides are complete and cover most topics. For 8.04, the Desktop Documentation section covers key desktop topics such as software management, music and video applications, Internet applications including mail and instant messaging, security, and a guide for new users. The short-term support releases tend to have just a few detailed documentation topics such as software management, desktop customization, security, multimedia and Internet applications, and printing. These will vary depending on what new features are included in the release. One of the most helpful pages is the Community Contributed Documentation page. Here you will find detailed documentation on installation of all Ubuntu releases, using the desktop, installing software, and configuring devices. Always check the page for your Ubuntu release first. The page includes these main sections:

• Getting Help: Links to documentation and FAQs. The official documentation link displays the tabbed page for that release on help.ubuntu.com.

• Getting Ubuntu: Link to the Install page with sections on desktop, server, and alternate installations. Also information on how to move from using other operating systems such as Windows or Mac.

• Using and Customizing Your System: Sections on managing and installing software, Internet access, configuring multimedia applications, setting up accessibility, the desktop appearance (eye candy), server configuration, and development tools (programming).

• Maintain Your Computer: Links to System Administration, Security, and Troubleshooting Guides pages. System Administration covers topics such as adding users, configuring the GRUB boot loader, setting the time and date, and installing software. The Security page covers lower-level issues such as iptables for firewalls and how GPG security works. Of particular interest are the Linux Unified Key Setup (LUKS)–encrypted file system how-tos.

• Connecting and Configuring Hardware: Links to pages on drives and partitions, input devices, wireless configuration, printers, sound, video, and laptops.


ubuntuforums.org
Ubuntu Forums provides detailed online support and discussion for users. An Absolute Beginner section provides an area where new users can obtain answers to their questions. Sticky threads include both quick and complete guides to installation for the current Ubuntu release. You can use the search feature to find discussions on your topic of interest. The main support categories section covers specific support areas such as networking, multimedia, laptops, security, and 64-bit support. Other community discussions cover ongoing work such as virtualization, art and design, gaming, education and science, Wine, assistive technology, and even testimonials. Here you will also find community announcements and news. Of particular interest are the third-party project forums, covering projects such as Mythbuntu (MythTV on Ubuntu), the Ubuntu Podcast forum, Ubuntu Women, and Ubuntu Gamers. The forum community discussion area is where you talk about anything else. The ubuntuforums.org site also provides a gallery page for posted screenshots as well as RSS feeds for specific forums.


ubuntuguide.org
The Ubuntu Guide is a kind of all-purpose how-to for frequently asked questions. It is independent of the official Ubuntu site and can deal with topics such as how to get DVD video to work. Other areas cover popular add-on applications such as Flash, Adobe Reader, and MPlayer. The Hardware section deals with specific hardware such as Nvidia drivers and Logitech mice. Emulators such as Wine and VMware are also discussed.


Ubuntu News and Blog Sites
Several news and blog sites are accessible from the News pop-up menu on the www.ubuntu.com site.
• fridge.ubuntu.com: The Fridge site lists the latest news and developments for Ubuntu. It features the Weekly Newsletter, latest announcements, and upcoming events.

• planet.ubuntu.com: Ubuntu blog for members and developers.

• blog.canonical.com: Canonical news.


Linux Documentation
Linux documentation has also been developed over the Internet. Much of the documentation currently available for Linux can be downloaded from Internet FTP sites. A special Linux project called the Linux Documentation Project (LDP), headed by Matt Welsh, has developed a complete set of Linux manuals. The documentation is available at the LDP home site at www.tldp.org. The Linux documentation for your installed software will be available at your /usr/share/doc directory.

Source of Information : McGraw Hill Ubuntu The Complete Reference

Monday, November 10, 2008

Ubuntu Server Virtualization Solutions

Many solutions are currently available to work with virtualization, but three of them are particularly important:
• VMware
• Xen
• KVM (Kernel-based Virtual Machine)

As for the other solutions, you won’t often find them in a data center because of their considerable limitations, which include a lack of support, a limited selection of operating systems that can be installed as virtual machines, and a severe performance penalty when using them. For these reasons, I’ll ignore them here, except for one. If you are interested in running Ubuntu Server in a virtualized environment from a desktop, you should consider installing VirtualBox, which offers an excellent virtualization solution that runs from a graphical desktop.

Of the three important technologies, VMware is the current market leader. It offers a commercial solution to virtualize many different operating systems and is a well established virtualization technology that has been available for more than 10 years. The most important VMware version in the data center is VMware ESX. You can use ESX as a virtualization host, on which you will install virtualized machines. ESX is made of a tuned Linux kernel that integrates the virtual machine manager, which is the process responsible for virtualization.

However, if you want to use VMware ESX as a virtualization platform, you’ll have to do it by running Ubuntu Server as a virtualized “guest” operating system within the VMware environment. There’s currently no way to combine VMware ESX and Ubuntu Server as a virtualization “host” platform (and there will never be such a method). VMware ESX is a proprietary, well-tuned operating system environment that has virtualization as its only purpose. There is no need to replace it with anything open source.

The other important player in the field of virtualization is KVM, which offers virtualization support in the Linux kernel itself. KVM is currently the default virtualization technology used in Ubuntu Server. Other Linux vendors such as Red Hat also embrace it as their default. To use it, you’ll need the kvm.ko kernel module for Linux, a CPU that has built-in virtualization support, and of course a kernel that supports KVM virtualization. (The 2.6.20 kernel is the first Linux kernel to do this.) To create virtual machines with KVM, you’ll use the /dev/kvm interface, and this functionality requires a modified version of the QEMU program.
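
A quick way to check whether a particular machine meets these requirements is to look for the CPU flags, the kernel module, and the device node; for example:

egrep -c '(vmx|svm)' /proc/cpuinfo
lsmod | grep kvm
ls -l /dev/kvm

A nonzero count from the first command means the processor advertises Intel VT or AMD SVM support; the other two commands show whether the kvm modules are loaded and whether the /dev/kvm interface is present.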

QEMU was originally developed as an open-source emulation product, but it never became very successful in the data center. And even though it was developed to be used as a virtualization solution, it never really made it. QEMU tools are still very useful, however, and QEMU tools and solutions are used in both KVM and Xen environments. Currently, most operating systems are supported on a KVM virtual host, provided that the operating system runs on the same processor architecture.

The other major player in the Linux virtualization market is Xen, which began as a research project at the University of Cambridge (see http://www.cl.cam.ac.uk/research/srg/netos/xen). Its core component is its hypervisor, the layer that makes it possible to create virtual machines and to handle instructions generated by those virtual machines. When used on a virtual machine host, the hypervisor replaces the normal Linux kernel, which is loaded only after the Xen hypervisor. Xen is currently one of the best virtualization platforms available on Linux, mainly because of its strong developer community, which includes hardware vendors such as Intel, HP, and AMD; and software vendors such as Novell and Red Hat. You can use Xen on Ubuntu Server as well, but the default virtualization stack is KVM.

Emulation means that software is used to simulate a hardware platform. An example is when you run a Sega Megadrive/Genesis emulator on your PC to run old games; the software runs all CPU instructions just as the hardware does. The emulator behaves like a software processor, a pure software virtual machine.

You can run an i386 operating system on an i386-based CPU in two ways. First, you can use an i386 software emulator running on i386 (examples are Bochs and QEMU). In such a solution, the software behaves like a PC, reproducing the complete hardware platform. Second, you can use a virtualization solution such as VMware Workstation. The virtualization solution does not provide a virtual CPU or any virtual base component of the basic PC hardware (IRQ controllers, hardware clock, and so on); it just puts the program to be virtualized on the real CPU and lets it execute the code. That solution needs complete hardware control, which is why it needs to run in privileged mode on the CPU and provides kernel modules for Linux that run in the kernel. Virtualization is not the next generation of emulation; it’s a different way of executing an operating system.

Source of Information : Apress Beginning Ubuntu LTS Server Administration

Saturday, November 8, 2008

Choosing Your Shell

In most Linux systems, your default shell is the bash shell. To find out what your current login shell is, type the following command:

$ echo $SHELL
/bin/bash

In this example, it’s the bash shell. There are many other shells, and you can activate a different one by simply typing the new shell’s command (ksh, tcsh, csh, sh, bash, and so forth) from the current shell. For example, to change temporarily to the C shell, type the following command:

$ csh


You might want to choose a different shell to use because:

• You are used to using UNIX System V systems (often ksh by default) or Sun Microsystems and other Berkeley UNIX–based distributions (frequently csh by default), and you are more comfortable using default shells from those environments.

• You want to run shell scripts that were created for a particular shell environment, and you need to run the shell for which they were made so you can test or use those scripts.

• You might simply prefer features in one shell over those in another. For example, a member of my Linux Users Group prefers ksh over bash because he doesn’t like the way aliases are always set up with bash.

Although most Linux users have a preference for one shell or another, when you know how to use one shell, you can quickly learn any of the others by occasionally referring to the shell’s man page (for example, type man bash). Most people use bash just because they don’t have a particular reason for using a different shell.
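
If you do settle on a shell other than bash, you can make it your default login shell rather than starting it by hand each time. For example, assuming tcsh is installed (check /etc/shells for the exact path on your system):

$ cat /etc/shells
$ chsh -s /usr/bin/tcsh

The change made by chsh takes effect the next time you log in.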


Using bash (and Earlier sh) Shells
As mentioned earlier, the name bash is an acronym for Bourne Again Shell, acknowledging the roots of bash coming from the Bourne shell (sh command) created by Steve Bourne at AT&T Bell Labs. Brian Fox of the Free Software Foundation created bash, under the auspices of the GNU Project. Development was later taken over by Chet Ramey at Case Western Reserve University.

Bash includes features originally developed for sh and ksh shells in early UNIX systems, as well as some csh features. Expect bash to be the default shell in whatever Linux system you are using, with the exception of some specialized Linux systems (such as those run on embedded devices or run from a floppy disk) that may require a smaller shell that needs less memory and offers fewer features.

Bash can be run in various compatibility modes so that it behaves like different shells. It can be run to behave as a Bourne shell (bash +B) or as a POSIX-compliant shell (type bash --posix), for example, enabling it to read configuration files that are specific to those shells and run initialization shell scripts written directly for those shells, with a greater chance of success.


Using tcsh (and Earlier csh) Shells
The tcsh shell is the open source version of the C shell (csh). The csh shell was created by Bill Joy and used with most Berkeley UNIX systems (such as those produced by Sun Microsystems) as the default shell. Features from the TENEX and TOPS-20 operating systems (used on PDP-10s in the 1970s) that are included in this shell are responsible for the T in tcsh. Many features of the original csh shell, such as command-line editing and its history mechanism, are included in tcsh as well as in other shells. While you can run both csh and tcsh on most Linux systems, both commands actually point to the same executable file. In other words, starting csh actually runs the tcsh shell in csh compatibility mode.


Using ash
The ash shell is a lightweight version of the Berkeley UNIX sh shell. It doesn’t include many of the sh shell’s basic features and is missing, for example, command history. Kenneth Almquist created the ash shell. The ash shell is a good shell for embedded systems that have fewer system resources available. The ash shell is about one-seventh the size of bash (about 100K versus 712K for bash). Because of cheaper memory prices these days, however, many embedded and small bootable Linux systems have enough space to include the full bash shell.


Using ksh
The ksh shell was created by David Korn at AT&T Bell Labs and is the successor to the sh shell. It became the default and most commonly used shell with UNIX System V systems. The open source version of ksh was originally available in many rpm-based systems (such as Fedora and Red Hat Enterprise Linux) as part of the pdksh package. Now, however, David Korn has released the original ksh shell as open source, so you can look for it as part of a ksh software package in most Linux systems (see www.kornshell.com).


Using zsh
The zsh shell is another clone of the sh shell. It is POSIX-compliant (as is bash), but includes some different features, such as spell checking and a different approach to command editing. Mac OS X systems include zsh, although bash is used there as the default shell.

Source of Information : Linux Bible 2008 Edition

Friday, November 7, 2008

Ubuntu Linux Special Characters

Special characters. Special characters have a special meaning to the shell. Avoid accidentally using them as regular characters until you understand how the shell interprets them. For example, it is best to avoid using any of the following characters in a filename (even though emacs and some other programs do) because they make the file harder to reference on the command line:

& ; | * ? ' " ` [ ] ( ) $ < > { } # / \ ! ~

Whitespace. Although not considered special characters, RETURN, SPACE, and TAB also have special meanings to the shell. RETURN usually ends a command line and initiates execution of a command. The SPACE and TAB characters separate elements on the command line and are collectively known as whitespace or blanks.

Quoting special characters. If you need to use a character that has a special meaning to the shell as a regular character, you can quote (or escape) it. When you quote a special character, you keep the shell from giving it special meaning. The shell treats a quoted special character as a regular character. However, a slash (/) is always a separator in a pathname, even when you quote it.

Backslash. To quote a character, precede it with a backslash (\). When two or more special characters appear together, you must precede each with a backslash (for example, you would enter ** as \*\*). You can quote a backslash just as you would quote any other special character—by preceding it with a backslash ( \\).

Single quotation marks. Another way of quoting special characters is to enclose them between single quotation marks: '**'. You can quote many special and regular characters between a pair of single quotation marks: 'This is a special character: >'. The regular characters are interpreted as usual, and the shell also interprets the special characters as regular characters.
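
To see the difference quoting makes, try the following in a directory that contains a few files. The first command lets the shell expand the * into the list of filenames; the second and third print a literal *:

$ echo *
$ echo '*'
$ echo \*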

The only way to quote the erase character (CONTROL-H), the line kill character (CONTROL-U), and other control characters (try CONTROL-M) is by preceding each with a CONTROL-V. Single quotation marks and backslashes do not work. Try the following:

$ echo 'xxxxxxCONTROL-U'
$ echo xxxxxxCONTROL-V CONTROL-U

Source of Information : A Practical Guide to Ubuntu Linux

Thursday, November 6, 2008

Ubuntu LiveCD

The standard Ubuntu Desktop CD and the Install DVD can both operate as LiveCDs (server and alternate editions do not), so you can run Ubuntu from any CD-ROM drive. In effect, you can carry your operating system with you on a CD. New users can also use the LiveCD to try out Ubuntu to see if they like it. The Ubuntu Desktop CD will run as a LiveCD automatically, and with the Install DVD it is a start-up option. Both run GNOME as the desktop. If you want to use KDE instead, you would use the Kubuntu CD.

Keep in mind that all the LiveCDs also function as install discs for Ubuntu. Although a LiveCD itself provides only a limited collection of software, it installs a full-fledged Ubuntu operating system that can be expanded and updated from the Ubuntu online repositories. From the LiveCD desktop, double-click the Install icon on the desktop to start the installation. The LiveCD provided by Ubuntu includes a limited set of software packages. On the Ubuntu Desktop CD, you use GNOME for desktop support. Other than these limitations, you’ll have a fully operational Ubuntu desktop. You have the full set of administrative tools, with which you can add users, change configuration settings, and even add software, while the LiveCD is running. When you shut down, the configuration information is lost, including any software you have added. Files and data can be written to removable devices such as USB drives and writable CD/DVD discs, letting you save your data during a LiveCD session.

When you start up the Ubuntu Desktop CD, the GNOME desktop is automatically displayed and you are logged in as the live session user. The top panel displays menus and application icons for a Web browser (Firefox) and mail. To the right is a network connection icon for NetworkManager, which you can configure (with a right-click) for wireless access. At the right side of the top panel is a Quit button you can click to shut down your system. It is important to use the Quit button to unmount any removable devices safely.

An icon is displayed for an Examples directory. Click it to access example files for OpenOffice.org (Productivity), Ogg video and SPX/Ogg sound files (Multimedia), and Gimp XCF and PNG image files (Graphics). OpenOffice.org files begin with the prefix oo- and include word processing (odt), spreadsheet (xls), presentation (odp), and drawing (odg) files. Check the oo-welcome.odt file for information about Ubuntu and the oo-about-these-files.odt file for information about the example files. Also included are PNG image files of the official Ubuntu logos for Ubuntu, Kubuntu, and Edubuntu.

The Computer window, accessible from the Places menu, displays icons for all partitions on your current computer. These will be automatically mounted as read-only, including Windows file systems. The File System icon will let you peruse the configuration files, but these are located on a Read-Only File System (ROFS) that you can access but not change. These folders and files will show a lock emblem on their icons.

You can save files to your home directory, but they are only temporary and will disappear at the end of the session. Copy them to a DVD, USB drive, or other removable device to save them. An Install icon lets you install Ubuntu on your computer, performing a standard installation to your hard drive.

Source of Information : McGraw Hill Ubuntu The Complete Reference

Tuesday, November 4, 2008

Ubuntu 8.04 LTS

Ubuntu 8.04 includes the following features:

• PolicyKit authorization for users, allowing limited controlled administrative access to administration tools and storage and media devices.

• Brasero GNOME CD/DVD burner.

• Kernel-based Virtual Machine (KVM) support is included with the kernel. KVM uses hardware virtualization-enabled processors, such as those with Intel Virtualization Technology (VT) or AMD Secure Virtual Machine (SVM), to support hardware-level guest operating systems. Most standard Intel and AMD processors already provide this support.

• Use Virtual Machine Manager to manage and install both KVM and Xen virtual machines.

• The java-gcj-compat collection provides Java runtime environment compatibility. It consists of GNU Java runtime (libgcj), the Eclipse Java compiler (ecj), and a set of wrappers and links (java-gcj-compat).

• New applications such as Transmission BitTorrent client, Vinagre Virtual Network Client, and the World Clock applet with weather around the world.

• Features automatic detection of removable devices such as USB printers, digital cameras, and card readers. CD/DVD discs are treated as removable devices, automatically displayed and accessed when inserted.

• GNOME supports GUI access to all removable devices and shared directories on networked hosts, including Windows folders, using the GNOME Virtual File System, gvfs, which replaces gnomevfs.

• Any NTFS Windows file systems on your computer are automatically detected and mounted using ntfs-3g. Mounted file systems are located in the media directory.

• Full IPv6 network protocol support, including automatic addressing and renumbering.

• NetworkManager will automatically detect wireless network connections.

• Information about hotplugged devices is provided to applications with the Hardware Abstraction Layer (HAL) from www.freedesktop.org. This allows desktops such as GNOME to display and manage removable devices easily.

• All devices are treated logically as removable and automatically configured by udev. Fixed devices cannot be removed. This feature is meant to let Linux accommodate a wide variety of devices, such as digital cameras, PDAs, and cell phones. PCMCIA and network devices are managed by udev and HAL directly.

• The Update Manager automatically updates your Ubuntu system and all its installed applications, from the Ubuntu online repositories.

• Software management (Synaptic Package Manager) accesses and installs software directly from all your configured online Ubuntu repositories.

• The current version of OpenOffice.org provides effective and MS Office–competitive applications, featuring support for document storage standards.

• Complete range of system and network administration tools featuring easy-to-use GUI interfaces.

• Wine Windows Compatibility Layer lets you run most popular Windows applications directly on your Ubuntu desktop.

Source of Information : McGraw Hill Ubuntu The Complete Reference

Monday, November 3, 2008

Ubuntu Editions

Ubuntu is released in several editions, each designed for a distinct group of users or functions. Editions install different collections of software such as the KDE, the XFce desktop, servers, educational software, and multimedia applications. ISO images can be downloaded directly or using a BitTorrent application.

Desktop install. LiveCD using GNOME desktop, www.ubuntu.com/getubuntu.

Server install. Install server software (no desktop), www.ubuntu.com/getubuntu.

Alternate install. Install enhanced features, www.releases.ubuntu.com.

Kubuntu. LiveCD using KDE instead of GNOME, www.kubuntu.org.

Xubuntu. Uses the Xfce desktop instead of GNOME, www.xubuntu.org. Useful for laptops.

Edubuntu. Installs educational software: Desktop, Server, and Server add-on CDs, www.edubuntu.org.

Gobuntu. Uses only open source software; no access to restricted software of any kind, www.ubuntu.com.

Ubuntu Studio. Ubuntu Desktop with multimedia and graphics production applications, www.ubuntustudio.org.

Mythbuntu. Ubuntu Desktop with MythTV multimedia and digital video recorder (DVR) applications, www.mythbuntu.org.


The Ubuntu Desktop Edition provides standard functionality for end users. The standard Ubuntu release provides a LiveCD using the GNOME desktop. Most users would install this edition. This is the CD image that you download from the Get Ubuntu Download page at www.ubuntu.com/getubuntu/download.

Those who want to run Ubuntu as a server to provide an Internet service such as a Web site would use the Ubuntu Server Edition. The Server Edition provides only a simple command line interface; it does not install the desktop. It is primarily designed to run servers. Keep in mind that you could install the desktop first and later download server software from the Ubuntu repositories, running them from a system that also has a desktop. You do not have to install the Server Edition to install and run servers. The Server Edition can be downloaded from the Get Ubuntu Download page.

Users who want more enhanced operating system features such as RAID arrays or file system encryption would use the alternate edition. The alternate edition, along with the Desktop and Server editions, can be downloaded directly from http://releases.ubuntu.com/hardy or http://releases.ubuntu.com/8.04.

Other editions use either a different desktop or a specialized collection of software for certain groups of users. Links to the editions are listed on the www.ubuntu.com Web page. From there you can download their live/install CDs. The Kubuntu edition uses KDE instead of GNOME. Xubuntu uses the Xfce desktop instead of GNOME; this is a stripped-down and highly efficient desktop, ideal for low power use on laptops and smaller computers. The Edubuntu edition provides educational software and can also be used with a specialized Edubuntu server to provide educational software on a school network. The Gobuntu edition is a modified version of the standard edition that includes only open source software, with no access to commercial software of any kind, including restricted vendor graphics drivers such as those from Nvidia or ATI. Only X.org open source display drivers are used. The Ubuntu Studio edition is a new edition that provides a collection of multimedia and image production software. The Mythbuntu edition is designed to install and run the MythTV software, letting you use Ubuntu to operate as a multimedia DVR and video playback system.

The Kubuntu and Edubuntu editions can be downloaded directly from http://releases.ubuntu.com. The Gobuntu, Mythbuntu, Xubuntu, and Ubuntu Studio editions are all available, along with all the other editions and Ubuntu releases, on the cdimage server at www.cdimage.ubuntu.com. Keep in mind that all these editions are released as LiveCDs or DVD install discs, for which there are two versions: a 32-bit x86 version and a 64-bit x86_64 version. Older computers may support only a 32-bit version, whereas most current computers will support the 64-bit versions. Check your computer hardware specifications to be sure. The 64-bit version should run faster, and most computer software is now available in stable 64-bit packages. The http://releases.ubuntu.com and www.cdimage.ubuntu.com sites also hold BitTorrent and Jigdo download files for the editions they provide.

www.ubuntu.com/getubuntu/download. Primary download site for desktop and servers.

http://releases.ubuntu.com/8.04 or www.cdimage.ubuntu.com/releases. Download sites for alternate, server, and desktop CDs and the Install DVD, as well as BitTorrent files.

http://releases.ubuntu.com. Download site for Ubuntu editions, including Kubuntu and Edubuntu. Also check their respective Web sites.

www.cdimage.ubuntu.com. Download site for all Ubuntu editions, including Xubuntu, Mythbuntu, Gobuntu, and Ubuntu Studio. Also check their respective Web sites. Includes Kubuntu, Ubuntu, and Edubuntu.

https://launchpad.net. Ubuntu mirrors.

http://torrent.ubuntu.com. Ubuntu BitTorrent site for BitTorrent downloads of Ubuntu distribution ISO images.

Source of Information : McGraw Hill Ubuntu The Complete Reference

Sunday, November 2, 2008

Open Source Software

Linux is developed as a cooperative open source effort over the Internet, so no company or institution controls Linux. Software developed for Linux reflects this background.
Development often takes place when Linux users decide to work together on a project. The software is posted at an Internet site, and any Linux user can then access the site and download the software. Linux software development has always operated in an Internet environment and is global in scope, enlisting programmers from around the world. The only thing you need to start a Linux-based software project is a Web site.

Most Linux software is developed as open source software, and the source code for an application is freely distributed along with the application. Programmers over the Internet can make their own contributions to a software package’s development, modifying and correcting the source code, which is included in all its distributions and is freely available on the Internet. Many major software development efforts are also open source projects, as are the KDE and GNOME desktops along with most of their applications. The OpenOffice office suite supported by Sun is an open source project based on the StarOffice office suite (Sun’s commercial version of OpenOffice). You can find more information about the Open Source Initiative at www.opensource.org.

Open source software is protected by public licenses that prevent commercial companies from taking control of the software by adding modifications of their own, copyrighting those changes, and selling the software as their own product. The most popular public license is the GNU GPL, provided by the Free Software Foundation, under which Linux is distributed. The GNU GPL retains the copyright, freely licensing the software with the requirement that the software and any modifications made to it are always freely available. Other public licenses have been created to support the demands of different kinds of open source projects. The GNU Lesser General Public License (LGPL) lets commercial applications use GNU-licensed software libraries. The Qt Public License (QPL) lets open source developers use the Qt libraries essential to the KDE desktop. You can find a complete listing at www.opensource.org.

Linux is currently copyrighted under a GNU public license provided by the Free Software Foundation, and it is often referred to as GNU software (see www.gnu.org). GNU software is distributed free, provided it is freely distributed to others. GNU software has proved both reliable and effective. Many of the popular Linux utilities, such as C compilers, shells, and editors, are GNU software applications. Installed with your Linux distribution are the GNU C++ and Lisp compilers, Vi and Emacs editors, BASH and TCSH shells, as well as TeX and Ghostscript document formatters. In addition, many open source software projects are licensed under the GNU GPL. Most of these applications are available on the Ubuntu software repositories. Under the terms of the GNU GPL, the original author retains the copyright, although anyone can modify the software and redistribute it, provided the source code is included, made public, and provided free. Also, no restriction exists on selling the software or giving it away free.

One distributor could charge for the software, while another could provide it free of charge. Major software companies are also providing Linux versions of their most popular applications. Oracle provides a Linux version of its Oracle database. (At present, no plans seem in the works for Microsoft applications, though you can use Wine, the Windows compatibility layer, to run many Microsoft applications directly on Linux.)

Source of Information : McGraw Hill Ubuntu The Complete Reference