Tuesday, October 26, 2004
Here is a great article that fits nicely with why I chose Debian over other Linux distributions. My first attempt at Linux was with Debian Potato. I was able to install the OS, but I couldn't figure out how to get the graphics to work. I could log in at the prompt, but I didn't know what to do in Linux or why I'd want to work in it. It certainly didn't seem like a reason to switch from Windows.
I started with Debian because the Red Hat website said they wanted money for Linux, and I didn't want to spend time downloading something with a shareware-style restriction. Debian was one of the only distros that was free, and they had a step-by-step guide for downloading and installing the OS. Of course, that didn't include an easy graphics installation.
My second distro was Mandrake. It installed a GUI without any problems; in fact, the installer seemed modeled after Windows 2000's. I was finally able to see what Linux was all about. It had a few games installed, but I had a hard time understanding what was so special about Linux. Why were so many people talking about it? I soon found problems with Mandrake. It was easy to install, but I quickly broke it, probably because of my inexperience with Linux. And I couldn't easily update the software, because Mandrake also wanted to charge money for their packages. It is possible that spending money on either Red Hat or Mandrake would have increased my enjoyment of Linux, but I wanted to test-drive it and see what all the fuss was about.
I gave up on Linux for 3-4 months. Sure, I had it installed as part of a dual-boot environment at home and on a separate test computer at work, but the software did very little for me. I was able to play a couple of fun games (Lbreaker), but as far as productivity went, it hindered more than it helped.
After trying Debian again and toughing out the video driver issue, I was able to understand why Linux was doing so well. (I attempted to install a newer X server from source, which worked, but I found out later that all I had to do was track the testing branch.) Linux/Debian gave me hundreds of quality enterprise applications that weren't even available under Windows, let alone easily found amongst the proprietary crap. I was able to monitor network traffic with EtherApe and graph it with MRTG. I figured that if I spent time understanding what all 8,000 packages did, I would learn how to better manage all types of computers.
After a while of building test environments on Debian, I was required to go back to Windows exclusively. I quickly learned that some things that were easy on Debian couldn't be done on Windows at all. I downloaded Cygwin and other Windows ports of GPL software and found that it was inevitably harder to keep that software up-to-date than it was with Debian. Under Debian, I can run 'apt-get upgrade' to have every package that came from Debian upgraded without breaking my system.
Now I compare my Windows 2000 machine to Debian unstable and find Windows much more unstable. The Windows hard drive constantly needs hand-holding because of fragmentation; the Debian machine has never needed defragmentation, and I have abused it more. My computer quickly became a test machine that constantly gets software installed and uninstalled, which works out fine under Debian but not under Windows. Granted, Microsoft controls only a fraction of the packages installed on the Windows machine, whereas on Debian I am usually able to use official packages for 99% of what I run (unofficial packages like mplayer may make up 5%), but under Debian the system still works after everything is installed.
Anyways, I probably won't convince anyone which OS to use, but hopefully I can encourage someone else to try out Debian and fight through the initial difficulties.
Friday, October 22, 2004
Here is a great article explaining the philosophies behind Microsoft and Open Source. Microsoft is all about making money with software, which is fine, but it doesn't produce great software. I believe Microsoft's greatest asset is finding the balance between the features users want and the bugs users will tolerate. Users are able to tolerate X number of program failures, and Microsoft can always fix software bugs in the next update.
For the most part, people don't understand how software is supposed to work. When they click on a button and nothing happens, they click on the button a second time, expecting a different result from the same input. It is very hard to teach a person to get around a specific software bug by showing them a different way to achieve the same result.
a security flaw is just an exploitable bug... Security is not a feature you add to a product. It's not even a process, or an attitude, or whatever else you thought I was going to say. No, security is an emotion. Computers don't have emotions, people do. Security, to a programmer writing code, is having confidence that his code is correct. To be correct, it must be shown to everyone, including to the bad guys.
The truth about creating software is that it is full of mistakes. It was those mistakes, and how they get resolved, that drove me to find a different option in Open Source.
Thursday, October 21, 2004
I needed to find a network backup solution that would back up our 6 Windows 2000 servers without much of a budget. After looking at a host of Open Source options, I finally found one that just worked: BackupPC. It is elegant in the way it saves space on the hard drive by using Linux's hard links. Basically, it stores each unique file only once and creates hard links whenever another computer has a file with the same matching MD5 sum. I experienced close to the same savings as listed on their website.
One example of disk use: 95 laptops with each full backup averaging 3.6GB each, and each incremental averaging about 0.3GB. Storing three weekly full backups and six incremental backups per laptop is around 1200GB of raw data, but because of pooling and compression only 150GB is needed.
I installed BackupPC on a test system and it worked like a champ. I was able to very easily see what files were locked from Windows. I am currently having some issues because one of my production database servers has a file over 4 GB which seems to be a Samba limitation.
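The pooling idea is easy to demonstrate by hand. This is a simplified sketch, not BackupPC's actual pool layout; the file names and directory are made up for illustration.

```shell
# Sketch of hard-link pooling: the same file backed up from two machines is
# stored once, keyed by its MD5 sum. (Illustrative only; BackupPC's real
# pool format is more involved.)
pool=$(mktemp -d)
echo "identical document" > "$pool/host1_copy"       # first machine's backup
sum=$(md5sum "$pool/host1_copy" | cut -d' ' -f1)     # pool key
ln "$pool/host1_copy" "$pool/pool_$sum"              # file enters the pool
ln "$pool/pool_$sum" "$pool/host2_copy"              # second machine: a link, no new data
links=$(stat -c %h "$pool/host1_copy")               # all three names share one inode
echo "link count: $links"
```

All three directory entries point at the same data blocks, so the second backup costs a few bytes of metadata instead of another full copy.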
Another good option was Bacula. Bacula was my first choice because it had a native Windows client and plenty of enterprise options for future expandability. I believe it is more efficient to spend time early on learning a more complex system and adapting it to a simple task than to discover later that you need to implement a new technology because the current one doesn't scale very well.
Wednesday, October 13, 2004
Jeff Sandquist - Microsoft Evangelist - How I get things done / Kicking some butt on your email inbox - Part 1
Jeff Sandquist - Microsoft Evangelist - How I get things done / Kicking some butt on your email inbox - Part 2
I want to see if this system can help me improve my organization skills and give me more opportunity for creativity. I'm currently using Outlook 2000 and having difficulty using it effectively, so I am also using Lookout from http://www.lookoutsoft.com, an indexing add-on that has been purchased by Microsoft. Instead, I would like to work Request Tracker 3 into the picture to store all tasks.
Monday, October 4, 2004
Monday, September 27, 2004
Debian already has a staging ground for stable: the testing branch. It is easily available for anyone to install and includes a lot of solid software. I think we need to ask what stable has that testing does not.
One thing missing in testing is security updates. Debian is not committed to publishing security updates for testing, a policy based on their scarcity of resources. If that is what keeps us from using testing all of the time, can we correct it? Is this something another group of volunteers could help with?
Testing is also missing stability, which is obvious. However, we have to remember that software will always be untested and unstable until it is tested, bug reports are documented, and the bugs are fixed.
I'm sure I am missing other reasons why testing looks so tempting but cannot be used in a corporate environment.
Tuesday, September 14, 2004
Here is a quick note for other Debian users who have a Lexmark printer. Install cups, foomatic and alien through apt ('apt-get install cups foomatic alien'). Download the rpm packages from the above site and use alien on them. You will definitely have to restart cups ('/etc/init.d/cups restart') for the changes to take effect. I ended up rebooting the whole computer before the new printer driver showed up in the cups driver menu.
This works really well under unstable.
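Collected in one place, the steps above look something like this. A sketch only, to be run as root; the rpm file name is a placeholder for whatever driver you downloaded from the Lexmark site.

```shell
# Hedged sketch of the steps above; run as root.
apt-get install cups foomatic alien   # printing stack plus the rpm converter
alien -i lexmark-driver.rpm           # convert the vendor rpm to a .deb and install it
/etc/init.d/cups restart              # restart cups so the new driver appears
```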
Saturday, September 11, 2004
I suggested that Debian could incorporate the statistics from the popularity-contest package to create a kind of Google PageRank (tm) of packages. When you use aptitude or 'apt-cache search ftp', you could see that package foo has a popularity of 80% among all packages with ftp in the title, and that the package got 800 votes. This type of information tells a Debian user that there is probably strong development of the package now and in the future. Of course, the Debian user will still have the option of choosing a less popular package, but it would assist in the decision-making process.
To know what percentage of the Debian market package foo holds, one would have to separate the Debian packages into the appropriate market. You could use the same search term as in 'apt-cache search ftp' to focus on the packages with ftp in the title, then take the 800 votes for foo divided by the total number of votes for all packages with ftp in the title.
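With hypothetical numbers, the arithmetic is simple. Here foo's 800 votes are from the example above; the 1000-vote total for all ftp packages is made up for illustration.

```shell
# Hypothetical figures: foo has 800 popularity-contest votes, and all
# packages matching 'ftp' total 1000 votes between them.
foo_votes=800
ftp_total=1000
share=$(( foo_votes * 100 / ftp_total ))
echo "foo holds ${share}% of the ftp package market"
```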
Now to figure out a way to suggest this to the Debian developers behind apt-cache.
Tuesday, August 10, 2004
We had a problem to solve: how to effectively review the Event Logs of remote Windows clients to make sure our firewall and security policies were working. I learned the importance of reviewing security logs after I first used Windows 2000 TCP/IP filtering to limit access to our remote clients. This worked for most computers but failed whenever a DSL company added an internal modem; the TCP/IP filtering only worked when all traffic used the LAN interface. I didn't like finding out about the security problems after someone was infected by a worm.
I looked into two different tools for polling Windows event logs. One method was using a syslog Windows client to send a UDP message whenever an event log message was generated. I was able to find a GPL-licensed Windows client but not a Windows server piece. However, I could always use a Linux machine for the server.
The biggest advantage of this method was its pedigree in the Unix/Linux world. I felt comfortable using old, tried-and-true enterprise logging, and I really liked the idea of integrating some Linux technology into Windows. The biggest problem was that security log messages are pushed to the server as they happen, while our remote clients connect to work through a VPN and are not always able to reach the syslogd server. Apparently the messages would be serialized, making it easy to see when information was missing from the server, but I expected the remote clients to be disconnected perhaps 40% of the time. I thought about sending the UDP messages over the Internet through our firewall, but realized the messages would be unencrypted and could be sniffed by anyone.
So in order to use syslog with Windows logging, I would have to encrypt the UDP connection to an Internet-exposed server and look into caching the syslog messages, which was highly discouraged because of the bandwidth problems it would cause when everybody started their morning shift.
The other option is working quite well for us. I ended up using Event Collector/Monitor from SourceForge because of the GPL license and the fact that it pulls the information from the remote clients. The Event Monitor project uses the Windows admin$ share to copy the system log to a centralized server and then uses Perl to parse the contents into a SQL database (MS-SQL or MySQL). It has a PHP front end to enter the computer names and create search reports. It also has a filter system that emails an administrator when certain criteria are met. Currently, I have every warning, error, and security failure emailed to my Exchange account, which sorts the alerts into their own folder.
This software worked so well that I have included all of our in-house computers and servers. It was really useful to get an email for every error, although I'm starting to get too many emails and will probably create better filters. Now that I understand what errors I'm getting, I think more specific filters will work better than they would have if I hadn't seen all the errors first.
Through this program, I have found that there are more issues with the client computers than I remembered. I would see errors here and there when manually auditing an event log, but I was never able to see patterns across all the computers. Because of this software, I realized that one software package keeps asking for more security privileges than I gave it. It doesn't stop the program in any noticeable way, but now I have something to fix proactively.
I installed the Event Collector (Event Collector is the name of the Windows application that polls the clients) on my workstation, using IIS, MySQL, and Cygwin's Perl package. The actual event collecting can use quite a bit of processor, but usually only when adding a new client whose entire log needs to be parsed.
Thursday, July 29, 2004
This is supposed to be a good article on security that can be read from beginning to end.
My only question with the methods described is how to update your computer from Microsoft's update site. I don't think Firefox can run the ActiveX controls that requires.
Thursday, July 22, 2004
(This is all from memory of what I did last night. I have done this correctly on two systems, but I had forgotten the details in the months between.)
First, make sure that you need to do all of these steps. Run 'pppconfig', create a new connection, and allow the computer to auto-detect the modem. If the only option is /dev/ttyS0 (the first external serial port), or if you try the other serial ports (ttyS0-4) with no success, then you will need to add a new serial port. Look at 'man ttyS' for a mknod line that will start you off. I ran 'mknod -m 660 /dev/ttyS4 c 4 68', then 'chown root:tty /dev/ttyS4' to give the port the correct permissions.
Run 'ls /dev/ttyS*' to see what serial devices are installed on your computer. You can remove a serial device and recreate it, but be careful. On Knoppix, I saw that ttyS0-3 were set up differently than ttyS4-9. I ended up installing the modem on the /dev/ttyS4 port.
Run 'lspci -v' to see what I/O address your computer gave your internal modem. Find the modem in the list (you can narrow it down with 'lspci -v | grep -A 10 -i modem') and look for 'I/O ports at'. Also note the irq listed in the Flags section (mine was irq 11). The I/O address will always start with 0x followed by four hexadecimal digits. My lspci line was 'I/O ports at ccf0', so I used 0xccf0 for the port address.
You can see how the other serial ports are described by running 'setserial /dev/ttyS[0-4]'. This will not change anything; it only shows you what settings you want to emulate. Now run 'setserial /dev/ttyS4 uart 16550A port 0xccf0 irq 11'. If you read the 'man setserial' page, you will find that the UART setting describes the serial chip -- an infrared port is different from a modem port.
Now the computer should also recognize the modem at /dev/ttyS4 when you run pppconfig and pon. Setserial should remember the settings after a reboot, because it saves all ports that do not have a uart of unused or none listed. If it doesn't, run 'dpkg-reconfigure setserial' to make setserial restore the settings after every reboot.
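For reference, here is the whole sequence condensed. A sketch only, run as root; the port address 0xccf0 and irq 11 came from my lspci output and will differ on your hardware.

```shell
# Condensed sketch of the steps above; run as root. Port and irq values
# come from 'lspci -v' and are specific to my modem.
mknod -m 660 /dev/ttyS4 c 4 68        # create the device node (major 4, minor 68)
chown root:tty /dev/ttyS4             # same ownership as the other ports
lspci -v | grep -A 10 -i modem        # note 'I/O ports at' and the irq
setserial /dev/ttyS4 uart 16550A port 0xccf0 irq 11
dpkg-reconfigure setserial            # make the settings survive reboots
```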
Monday, June 28, 2004
Here is a great website showing how to use Microsoft's IP Security Settings to create an effective firewall. I will be modifying the rules to include a firewall for all ingress Internet traffic and both ingress and egress traffic through a PPTP tunnel.
I had to permit TCP traffic on port 1723 for PPTP and permit Other 47 (the GRE protocol). I then had to create a trusted network rule that allowed all traffic through.
I found this article that states the problems using Windows IPSec as a firewall tool. http://sinbad.zhoubin.com/read.html?board=Win&num=110
This Microsoft KB article will show how to disable the vulnerability through the registry. http://support.microsoft.com/default.aspx?scid=kb;en-us;811832
Thursday, June 24, 2004
I've been doing some reading on rsync, and it is a really neat utility. I would like to use it to move backups from 8 Windows 2000 servers across our network to a centralized backup server with a tape drive.
I looked into Arkeia (non-GPL), Bacula, and Amanda. Bacula and Arkeia have Windows clients; however, Bacula can't back up a Windows registry, so I would need to run ntbackup and then run Bacula. Arkeia seemed to work great, but I started getting concerned about how much stress I was adding to our network, and it is a lot more expensive than I was anticipating ($3000-6000 depending on how many servers have clients).
That is when I found out about rsync and BackupPC (GPL). Rsync has a great algorithm that allows it to 'cheat' a file transfer by only sending the binary differences between computers (you can read about it at http://samba.anu.edu.au/rsync/). This sounds like just the type of tool I need to create good network backups.
Now for the test: see if I can use rsync to speed up apt-get updates. After running 'apt-cache search rsync', I found that apt-proxy can use the rsync protocol. However, the Readme.gz file mentions that rsync doesn't work well with compressed files, and this article brings up the same concerns.
Since it is easier to try and fail than to spend any more time searching, my next post will show my results.
I have to stop using my Debian machine at work. Apparently, the ntlmaps package doesn't play well with Windows ISA (MS Proxy) policies; or you could say it works too well. I was sucking up all available Internet bandwidth whenever I ran apt-get upgrade.
I am running sid at work as my experimental server/workstation. Whenever I needed a new network or administration tool, I would search for one and apt-get install it. It was great. I installed Plone when I thought our intranet looked boring, I installed Request Tracker to see if I could replace Track-It!, and I even started running mrtg on the Debian machine to monitor our network health.
I was asked to stop using the Debian server on the network to see if that helped with our Internet bandwidth issue. Apparently it helped.
I think I figured out why I was getting priority when downloading web pages and files. If I downloaded a file through the ntlmaps proxy, I would get around 115 KB/s (920 kbps); if I downloaded through MS Proxy directly, policy would slow me down to 25 KB/s (200 kbps).
Since I was stuck on Windows 2000 for everything, I decided to download Cygwin to use some of the tools I was used to in Debian. When I downloaded the packages without going through Internet Explorer, my speed increased to the high 115 KB/s rate. The same thing happens if you use any program other than IE to download anything: since ntlmaps wasn't IE, there was no reason for the proxy to slow down the download.
Thursday, June 17, 2004
What a great blog article!
This article really puts together a lot of good information about the future of computing. It brings up a good comparison between web applications and rich clients.
I'm hoping that Linux will be able to fill in for Microsoft in the rich client market. The problems that Joel Spolsky wrote about sound too much like the problems developers have with KDE and Gnome. Maybe an article like this will help bring the Linux camp around to a common API.
Wednesday, June 16, 2004
Monday, June 14, 2004
I am not a developer, but I am a heavy tester of software. I like downloading and installing software tools whenever there is a download out there. I seem to save time if I can install the software in a test environment and look around at the interface and options, and then go back and read the manual.
I have two computers that I use in my office. One is Windows 2000 which is necessary for Exchange, and unique vendor client/server applications. The other is my Debian unstable desktop/server.
My Windows 2000 computer doesn't work like Debian does. It has fragmented files that can't be fixed -- even in safe mode. I have never had that problem in Linux. The registry is starting to look ugly from all of the software I have installed and uninstalled. Debian's apt-get allows me to install and uninstall software without leaving a trace. apt-get has kept my system clean and sane, without needing to reboot unless I update the kernel.
Speaking of rebooting, I have to reboot the Windows computer way too often. When I install new software: reboot. When I install Windows patches: reboot. When I install Office or other software patches: reboot. When I uninstall software: reboot.
The Debian computer allows me to try out enterprise server applications, scaled down to a few machines, without chasing purchase orders and new licenses. And if I am just testing something I would be willing to buy, I don't have to worry about shareware restrictions. I can keep my installation and configuration the same from testing to production.
My Debian workstation is becoming a useful swiss-army tool for many different ancillary server applications like mrtg, apache, request-tracker, phpbb2, etherape, ethereal, and others. Whereas upstairs, I have 8 rack-servers serving
Thursday, May 20, 2004
I love the new spatial setting. It is a new way to look at desktop management, and it takes a while to get used to. The annoying part is that a new window opens up for every folder you left-click.
After using it for a while, it seems obvious why it is there. The Gnome development team realized that people easily remember items spatially, so when you open your home folder it will always show up in the same location and at the same size you left it. This unconsciously helps me know what folder I am working in.
Nobody wants to see more than 3 folder windows on the desktop, so you either press Ctrl-Shift-W to close a window's parents, or open a new folder with the middle mouse button (pressing the scroll wheel), which closes the old window. It does take some time getting used to the new commands. I have my system set up to work with a single click, which helps.
Q: When will GNOME 2.6 go into sid/sarge?
A: When it is ready and when sarge and sid are ready for it. This means that we have to wait until the GNOME 2.4 packages in sarge are not expected to break, as we can't update them after we've uploaded GNOME 2.6 to unstable. This means you can help us by testing the old GNOME 2.4! Yay!
Friday, May 14, 2004
I have heard that Java is pretty much open source too; however, it doesn't allow distributions to redistribute the code, or something like that. I am also concerned that there is a lot of optimization I would have to do.
Tuesday, May 11, 2004
You are going to need the source code for your kernel at /usr/src/linux (create the symbolic link or rename the folder). Copy your distribution's configuration file from the /boot/ folder (Debian names it something like config-2.6.5). You will need certain kernel options compiled in for the linux-wlan-ng module to work; I ended up building a whole new kernel.
Run ./Configure from the linux-wlan-ng folder. It asks some basic setup questions and attempts to build everything. This didn't work for me, so I then ran 'make all' and 'make install' to have all the modules built and installed into the correct /lib/modules/`uname -r`/ folder. Run 'depmod' and 'modprobe -l | grep prism' to see if you have the prism2_usb module. Then load the module with 'modprobe prism2_usb'; the 802.11 driver will also get loaded.
I was also able to have hotplug (http://linux-hotplug.sourceforge.net/) recognize my toy by copying the prism2_usb line from modules.usbmap into /etc/hotplug/usb.usermap. I think this is what I used: 'cat /lib/modules/`uname -r`/modules.usbmap | grep prism2_usb >> /etc/hotplug/usb.usermap'
Now I don't have to have the USB-400 plugged into my laptop until I am ready to use it.
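The build-and-load sequence above, gathered into one sketch. It assumes you are in the linux-wlan-ng source tree with the kernel source prepared at /usr/src/linux as described, and that you run it as root.

```shell
# The steps above in one place (a sketch; assumes the linux-wlan-ng source
# tree and kernel source at /usr/src/linux, run as root).
./Configure                           # answer the basic setup questions
make all && make install              # build modules into /lib/modules/$(uname -r)/
depmod                                # rebuild the module dependency list
modprobe prism2_usb                   # loads the 802.11 driver as well
# Let hotplug load the driver automatically when the USB adapter appears:
grep prism2_usb /lib/modules/$(uname -r)/modules.usbmap >> /etc/hotplug/usb.usermap
```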
To have it connect to a wireless network, I had to change the /etc/network/interfaces to include:
iface wlan0 inet dhcp
#default was the name of my router I wanted to connect to.