Wednesday, October 27, 2010

Open Source Spotlight – FreeFileSync

In our household, we currently use five PC’s for various activities. One drives the television, one is for the kids, one is the desktop I do my work on, and two are laptops. Historically, keeping files current across all of them has been a major drag.

In years past, I’ve tried various strategies to keep our documents, photos, music, and video current. Certainly one good solution is to keep a centralized server and connect the client PC’s to it. In this scenario, there is one master set of files, and everyone always has the most current version available to them. Microsoft now makes Windows Home Server, which is designed for exactly this task. I’ve also run home-based servers on various flavors of Linux such as Ubuntu.

For us though, that didn’t work. First of all, I’m kind of a backup freak, and having all of my documents stored on a single spinning platter doesn’t sit well with me. Sure, I could implement a RAID array, but then we start talking about more money and complexity. Secondly, I was beginning to do quite a bit of travel, and back in the day of slow, expensive Internet service, there was just no good way to get to the files remotely. The final blow came when I started measuring the power consumption of various devices and how much they were costing me to run – that was the deciding factor that I didn’t “need” a dedicated file server running 24x7.

Another strategy that I’ve tinkered with is using an online data hosting service.  In the past, these seemed to be more work than they were worth – they had crude interfaces, limited storage, and I still had the issue of relatively slow Internet connectivity.  Today, Dropbox.com seems to have a viable option that I may pursue, whereby the files get stored on my local computer, but they get replicated out to the web.


But for the time being, I use a more direct approach to file management – FreeFileSync. FreeFileSync is an open-source program that lets you very quickly compare two folders, identify the differences, and synchronize them.

The program has some options to let you customize the general behavior, but usually the defaults are acceptable. For me, I point the left pane at my local C:\ drive data directory, and the right pane at my desktop PC, using a UNC path. Clicking the Compare button initiates the file comparison, which on modern hardware over a LAN takes about 20 seconds for my 15,000 files. A sample of the results can be seen here.

freefilesync-screenshot

The blue and green arrows in between the left and right panes show which direction the software has determined the files should be updated. The software has a fairly sophisticated algorithm to determine the correct action, based on file timestamps as well as an internal database of past synchronizations. This internal database is how FreeFileSync knows when a file has been deleted, so that the corresponding file on the opposite side of the sync can also be deleted.

Any of these default actions can be overridden by clicking in the center column before the synchronization takes place. Once you’ve reviewed the actions, clicking the Synchronize button at the right puts the changes into effect. This usually takes only a few seconds unless large amounts of data have changed; then the time depends on the bandwidth of your LAN connection and the speed of your hardware.

In all, I usually synchronize each of the laptops to the desktop about once a week.  It generally takes less than  two minutes to complete.  When I’m done, I have local access to any of my files from each of the PC’s, plus I have three separate copies of my irreplaceable photos and data.
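If you eventually want to automate the weekly run, FreeFileSync can also save a configured comparison as a batch job (an .ffs_batch file, via Save as batch job) that you can launch from a shortcut or the Windows Task Scheduler. A hypothetical invocation, assuming a default install path and a saved job named weekly-sync.ffs_batch:

"C:\Program Files\FreeFileSync\FreeFileSync.exe" "C:\Sync\weekly-sync.ffs_batch"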


I give FreeFileSync five out of five stars.

Sunday, August 22, 2010

Repairs to a DRE-4000 Aviation Headset

Since getting my ticket back in 2004, I’ve been using a pair of DRE headsets in the local FBO’s Skyhawk.  The headsets have worked quite well considering the $150 entry price.
A little over a year ago, I noticed that the cord was beginning to fray.  At that point, there were no usability problems, just cosmetic, but it was obvious that it wouldn’t last forever.
IMG_3191
Sure enough, one spring afternoon I loaded up the plane, and the lucky passenger had no mic audio. He could hear, but not talk.
I priced around a little bit and found I could buy a replacement cord for about $35. Add shipping to that, and you’re not too far away from a down payment on a brand-new headset. Instead, I decided to dig in and see if a repair was possible.
One of the things that I do like about the DRE’s is that the cord is replaceable with two thumb screws. I pulled the plug, which allowed me to remove the cover over the wires.
 IMG_3192
Obviously the yellow wire has a problem. It’s impossible to visually inspect whether any of the other wires have issues, since they’ve been potted in some sort of resin. The bad part about this type of design is that vibration and movement of the potting material can cause the solder joints to break or become intermittent deep inside, where they can’t be inspected or repaired.
Begin by taking a pair of pliers and breaking up the resin that’s encasing the existing wires. Don’t break the connector itself, but the wires will all have to be re-soldered, so don’t worry about them.
When making a repair like this, it’s best to cut the top couple of inches off of the cable and start fresh. Begin by putting the outer shell of the connector back on the wire (near my thumb). Strip the outer insulation back about an inch to expose the wires. Cut the braid off flush with the outer insulation. (Notice the “key” shown here on the shell of the connector. This ensures the connector can only be inserted in one direction, and it is critical in later steps.)
 IMG_3193
Separate the wires, and strip back about 1/8” of insulation on each.  Tin them with solder.  Also prepare the connector (at right) by tinning the pins and removing any old wire that may have been left.
 IMG_3195
Before final assembly, notice that there are two sides to the connector, and that it will only fit into the headset in one direction. The order shown here is with the “key” at the back of the photo (not visible).
DISCLAIMER: Fortunately, I had a second identical headset that I was able to ring out the wires and connectors on to determine the proper sequence of colors. There are no guarantees that these colors are the same for any other DRE-4000 headset, but they probably are. If in doubt, consult your avionics shop.
The order (from left to right, with the key in the back) is Red, White, Brown, Green, Black.   Solder the wires to the pins, making sure that the shell has already been placed over the cable and is ready to be slid into position.
IMG_3197
Like I said earlier, I really don’t like the potted connector/strain-relief solutions, but unfortunately there aren’t many options with this particular design.  Once you are comfortable with your solder work, mix up some 5-minute epoxy and drizzle it into the connector shell.  This epoxy is the only thing keeping the wires from being ripped out, so use plenty.
Finish pulling the shell down over the connector and then apply gentle pressure to the sides (I used clothespins) to keep everything tight while it cures. The hole in the headset that this connector fits into is fairly tight, so keeping pressure on it during the curing process is essential.
IMG_3202
That’s it.  Once it has cured, re-insert the plug into the headset, and replace the thumb screws.  If the wiring was correct, you should be good to go for a couple more years.  If it wasn’t correct, I’m sorry to tell you that you probably won’t be able to fix it.  The plastic used on my plug was just barely strong enough to survive one reconstructive surgery, and I don’t expect it to make it through a second.
Happy Flying!

Monday, May 3, 2010

Replacing the Battery in your Uninterruptible Power Supply

I decided several years ago (after a brief power outage) that I had better things to worry about than whether I had saved my open files recently enough when the power flickered. So I picked up an APC Back-UPS 650VA power supply to keep my computer running through a power outage.

apc-front

Fast-forward about four years, and I’m sitting at my PC working, when my UPS beeps and the screen goes black – exactly what I had been trying to avoid.


The most common cause of UPS problems is a failing battery.  The first step in diagnosing the problem is to open up the case to reveal the battery.  Usually the batteries are quite easy to get to – either the front of the box will pull off, revealing a couple of screws, or there will be a couple of obvious screws on the bottom of the unit.  For my Back-UPS 650, it’s the latter.

apc-removing-bat

Pull the battery and disconnect the positive and negative battery leads. These are normally just spade connectors on the smaller UPS’s, and no tools are required. This is your first chance to inspect the physical condition of the battery. Many of the batteries that I’ve had go bad actually produced enough heat internally to cause the sides to bulge. If the battery has been deformed, it must be replaced. Sometimes the battery will actually start leaking, and you’ll see a white powder crusted around the chassis or battery. This is also a dead giveaway that your battery is shot.

A standard DC voltmeter is often enough to diagnose a battery beyond the simple physical symptoms. Normally, a battery that is actively being charged will read about 13.8V. Once you disconnect the charger, you should still see about 12.5V. If you read anything less than about 12.0V, then you are missing a cell (or more) in the battery – a 12V sealed lead-acid battery is built from six roughly 2V cells, so a dead cell shows up as about a 2V drop – and it must be replaced.

Once in a while, a battery will hold its voltage when it’s sitting idle, but as soon as a load is placed on it, the voltage will drop. For a test load, I’d recommend something that will load it down with about 1 amp, such as a 10 ohm, 20W resistor. Another option would be a 12V car brake or dome light bulb if you happen to have one handy. Measure the voltage across the positive and negative leads before you attach the load, and again while the load is attached. It’s normal to see a drop of a couple tenths of a volt, but if the battery drops below 12.0V, it’s probably shot.
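To see where the “about 1 amp” figure comes from, here’s the quick Ohm’s law check on that 10 ohm resistor across a nominally healthy battery:

I = V / R = 12.5V / 10 ohms ≈ 1.25A
P = V² / R = (12.5V)² / 10 ohms ≈ 15.6W

So the 20W rating leaves a comfortable margin, and the resistor won’t cook itself during a short test.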


While it’s possible for other parts of the UPS to fail or become damaged, 95% of the UPS problems I’ve worked on came down to an aged and/or abused battery. The easiest way to find a replacement is to measure the physical dimensions and look for a match on a site such as www.digikey.com. (I say that this is the easiest because usually the original battery that comes from APC does not have any kind of amp-hour rating on it – they do that to discourage you from buying your own replacements…)

Search for “sealed lead acid batteries” and begin comparing the physical dimensions. Normally you will be looking for a 12V battery, usually in the 7AH to 28AH range. The physical dimensions are pretty well standardized across the industry; just be sure that it will fit inside the chassis of your UPS, and that the battery posts are compatible with the leads on the UPS.

apc-battery

Once you receive the new battery, installation is simply the reverse of disassembly. Oftentimes there is a small spark when you attach the second lead to the battery. Don’t be alarmed – this is normal.

Put the screws back in, and the UPS is again ready for operation.

Saturday, March 27, 2010

Opinions of Windows 7

I’ve been running ‘7 for four months now, and to summarize, I think Microsoft has a solid, viable operating system that’s ready for prime time. It’s time that both businesses and individuals accept that change happens and learn to deal with it.

My personal take was that Vista was simply ahead of its time – it required new, fast hardware that just wasn’t widely deployed.  On top of that, Vista changed many of the driver models which broke support for many older devices.  And that’s not to mention an almost complete lack of support for 64-bit drivers.

The industry has largely caught up, and Windows 7 capitalizes on that. Unless you’re still running the same PC that was considered old three years ago when Vista came out, you’re probably in pretty good shape to run 7. The video card is the most significant hardware requirement in 7 that I’ve run into problems with, but that can usually be resolved for about $45 and 30 minutes of your time.

For those coming from Windows XP, the learning curve is going to be steep, but not insurmountable, even for the least-tech-savvy users.  It just takes patience, and maybe some coaching from someone who has made the jump.  Many (although not all) of the XP machines will need to have hardware updated or replaced in order to make the jump.

For people like me who updated the hardware and had jumped to Vista already, the move to Windows 7 is pretty much a non-event. Your hardware should be capable, and there aren’t any major new changes like there were in Vista. My opinion is that if you liked Vista, you’ll love Windows 7.


What’s so great about it?

First of all, I really like the licensing change that Microsoft made for home users. If you’re a household like mine, you have multiple PC’s, and the thought of spending $125+ each on an upgrade to the latest and greatest just isn’t going to happen. For that, Microsoft came up with the Windows 7 Home Premium Family Pack. For $150, you get three licenses of ‘7 that you can put on your home PC’s. And contrary to the uber-helpful Best Buy employee’s training, yes, it is valid to upgrade your Windows XP machines with this 3-pack license. The only caveat with XP is that you have to install fresh, which means formatting your hard drive and starting over from scratch.

Additionally, there’s basically one “Home” license that most people will need for their home PC’s: Home Premium. The only other option would be Windows 7 Starter, but that’s only available on netbooks. For businesses, you basically only need Professional or Enterprise, depending on whether or not you’re involved with Microsoft Volume Licensing and Software Assurance.

Performance is good. Some will legitimately argue that ‘7 uses smoke and mirrors to provide the illusion that it’s running faster. Yes, ‘7 delays some non-essential services at boot time and switches off others by default. In this case though, perception is reality: boot times do seem faster than Vista, and probably on par with XP.

There are some subtle GUI changes that are really great. For example, I am almost always working on dual monitors, either at home or at work. Before, if I had a window maximized on the right monitor and wanted to move it to the left monitor, I would have to 1) restore the window, 2) drag it to the left monitor, and 3) maximize it again. Now you can grab the title bar of the maximized window, drag it to the other monitor, and release the mouse at the top of the screen, and it will automatically maximize again. Sure, there are 3rd-party utilities that could do that for you, but it’s nice to have a standardized feature built into the core OS.

You can also drag a window to the far left or right of a screen, and it will stretch the window to the full height, but will only take up half of the screen (either the right or left half). 

There are also keyboard shortcuts that stick windows to the right or left of the monitor. Hold down the Windows key and press the right or left arrow keys. If you have dual monitors, you can move a window between the monitors by just hitting the left or right arrow a couple of times. Finally, you can maximize or restore a window by using the Windows key and the Up/Down arrows.

The Shutdown option is easier to use than in Vista. It now defaults to Shutdown (instead of Sleep). If you don’t want to shut down, you can hover the mouse over the arrow right beside the Shutdown option for about half a second, and a menu will fly out giving you all of your Restart, Log off, Sleep, etc. options.

image

If you want to take a quick peek at a calendar, you can simply click on the clock in the taskbar, and a calendar will pop up. You used to be able to do this back in the XP days by double-clicking on the clock, but you had to be careful because that’s also how you changed the date/time. Now you single-click it, and you have a calendar that you can thumb through by months and years, without having to worry about accidentally altering the system time.

image

There are several new programs included with the OS that I find helpful. I won’t go into details of what they do exactly, but make it a point to try out the Snipping Tool for taking screenshots and the Problem Steps Recorder when you’re trying to communicate a problem to Tech Support. The Calculator program has been updated, and has different modes called Scientific, Programmer, and Statistics.

image

It’s not exactly a program per se, but you can now burn ISO images directly to a CD-ROM or DVD by just right-clicking on the ISO file.  No more trying to remember if you installed Roxio or Nero on this particular computer, and where you stuck the shortcut in the Start Menu.

Outstanding Issues

I do have a couple of issues with my installations that I wish I could figure out. At work, I have a Dell Optiplex 755 that used to have a dual-head ATI card in it (sorry, I’ve forgotten the model). The card worked fine on Vista, but after upgrading to ‘7, the fan on the video card began to randomly cycle on and off as if it were overheating. Usually a reboot would make it stop, but one day it was driving me up the wall and I wound up cutting the cord to the fan. That solved the noise, and the card ran fine afterward (which further leads me to believe it wasn’t actually a heat issue, but instead a driver problem). I finally replaced it the other day with an nVidia card as a precaution.

That same Optiplex also has problems shutting the NIC down when it goes to sleep, which was never a problem on Vista. I’ve tried some different settings with no change. I recently flashed the BIOS to the latest A16 version, but since then it has locked up tighter than a drum twice and needed to be powered down.

Sleep modes are also an issue on my Dell XPS 420 at home. It tries to go to sleep, but randomly wakes up for no apparent reason. Again, I’ve tried changing some of the wake-on-USB and wake-on-NIC settings, but no joy yet.

I will say that I’ve had more lockups and blue screens on Windows 7 in the last four months than I had on Vista and XP over the last four years, but most of them can be attributed to getting sleep to work correctly, as well as some new CAD software that I’ve been testing.


Conclusion

I hate to end this on a sour note. Yes, I’ve had some stability issues that are new to this OS, but keep in mind that both of these PC’s are operating on hardware that was not “designed” for ‘7. I’ve also been diving into designing printed circuit boards at home using a whole slew of new CAD programs, some of which I’m finding aren’t the most stable products in the industry.

Beyond that, I love the OS, and have quickly become spoiled to some of its features.  To me, XP is beginning to look very dated and archaic. 

The migration to Windows 7 does require a person to step outside of their comfort zone and embrace change, but in the end, it’s a good thing.

Robocopy - My New Best Friend

I’m not sure how I managed to miss this free and useful utility, but Robocopy has recently become one of my favorite Windows command line tools.  With a single command, you can keep two directories in sync with each other in a very fast, efficient manner.

In the past, I’ve managed to do some fairly clever things with Xcopy and other batch files, but I never felt that it was working very efficiently, and it was difficult to write and maintain. Now, with the Robocopy tool, I can quickly copy new files from my working directory off to a second location for backup or disaster recovery purposes.

Although it has many switches and options, most of my applications utilize either no special switches, or the /mir option.

Without switches, Robocopy will send all new files over to the destination. This is very similar in concept to xcopy /d, which copies over files that have a newer timestamp. Although that’s useful in certain applications – such as wanting to keep a copy of all files in a project, even ones that were deleted mid-stream – it isn’t always what you want. The problem is that if a file is deleted from the source, usually you want your backup copy to reflect that deletion.

That’s what the /mir switch is for. With this switch, Robocopy copies over all newer files just like before, but now it will also analyze the destination, and if a file exists in the destination but not in the source, it will delete it. This is an excellent way of keeping a copy of your data files on an external hard drive or on a second PC. Like I said earlier, the mechanism is quite efficient, so even if you’re replicating files to a server off-site, only the changes (at a file level, not block level, unfortunately) are copied over.

A case in point: I replicate about 100GB of user data (approximately 80 users) every day across a 3.0Mbps WAN link, and it takes right at an hour most days. Your mileage will vary…


I don’t have installations of each to verify, but I believe Robocopy is installed by default on Windows Vista and 7, as well as Server 2008 and 2008 R2. For XP and Server 2003/2003 R2, you will need to download it as part of the Windows Resource Kit from Microsoft. Installation is a no-brainer.


Once it’s installed, you can execute a command such as the following:

robocopy c:\source e:\destination

Where C:\source is the directory you want to back up, and E:\destination is your backup location.  This is a relatively safe operation, as no files would ever be deleted on either end. 

Once you are comfortable that things are being replicated correctly, if you want to replicate the deletions in C:\source to E:\destination, add the /mir switch.

robocopy c:\source e:\destination /mir

And that’s it – you’ll have an exact copy of C:\source in E:\destination.


Finally, a few other useful switches. 

The combination of /r:3 /w:3 tells Robocopy to retry files that are in use 3 times, and to wait 3 seconds between tries. By default, it waits 30 seconds and retries a million times, which seems a bit excessive. Most of the time for me, if a file is in use for more than 9 seconds, it’s probably safe to move on and try backing it up again tomorrow.

A friend recently found out (almost the hard way) that the /xjd switch makes it safer when backing up user data on Windows Vista and 7 machines. According to the documentation, this switch excludes junction points for directories. In practice, it can prevent circular references through some of Windows’ special directories such as C:\Users\username\Application Data\. Note that this option doesn’t exist on Server 2003 or XP.
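If you’re curious which junctions /xjd will be skipping on a given profile, you can list them yourself – the dir command’s /aL attribute filter shows reparse points (substitute a real profile path for username):

dir /aL C:\Users\username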

And finally, if you are scheduling the job, the /log:"c:\logs\backup.log" parameter is handy so that you have a record of what went on. Note that at least on Server 2008, you cannot write the log file directly to the root of the C: drive, so I now always create a C:\logs\ directory to dump my Robocopy logs into.

For me, a typical Robocopy command winds up looking like this:

robocopy c:\source e:\destination /r:3 /w:3 /xjd /mir /log:"c:\logs\backup.log"
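If you want that to run unattended every night, the same command drops straight into Task Scheduler. A rough sketch using schtasks – the task name and start time are placeholders, and I’ve dropped the quotes around the log path so they don’t collide with the quotes around the /tr argument:

schtasks /create /tn "NightlyBackup" /tr "robocopy c:\source e:\destination /r:3 /w:3 /xjd /mir /log:c:\logs\backup.log" /sc daily /st 01:30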


Keep in mind that this process is intended to copy a working directory to a backup drive/site. If you have two working directories that you want merged (as is the case if you have copies of your photos on both your laptop and desktop), then there are probably better tools out there.

For that situation, I use FreeFileSync, which is an open-source project on SourceForge.  The reason for using a GUI is simple – you don’t always want to keep the file with the newest timestamp.  By having a GUI with a list of the proposed changes on screen prior to any action taking place, you get the opportunity to override the defaults.

Good luck and don’t be DUMB!  Remember, Disaster Usually Motivates Backups!

Friday, January 8, 2010

DansGuardian Content Filtering with AD Integration

For our web/content filtering at work, we’ve used DansGuardian for several years with good success. Originally, we used a small service called identD on each of our Windows 2000 and XP PC’s. When Vista came out, the service failed to install. We were also running into complications as we started moving toward multi-user Terminal Services environments (the identD service was machine-centric, not user-centric).
I toyed with porting the service over to .NET to run on Vista, but ultimately, I decided that it was time to bite the bullet and use a built-in authentication method called NTLM, which could be facilitated by the Squid proxy server.
The following steps were used to configure the system on a Debian 4.0 (Etch) server. The process was largely borrowed from an article on HowToForge.

Theory of Operation

DansGuardian is a web filtering program that watches the content of a web page and, based on a number of criteria, decides whether or not to block the page. Unfortunately, DansGuardian makes it very clear that it is NOT a proxy or transport program, and therefore it needs a very close tie-in with the Squid proxy server.
Traditionally, DansGuardian is configured to listen on port 8080, and Squid is configured to listen on the localhost on port 3128. A client PC is then set up to connect through the proxy host on port 8080. When a request is made, DansGuardian passes the request on to Squid on port 3128, and filters the content based on Squid’s reply.
Dansguardian flow chart - old
In order to introduce NTLM authentication into the process, we have to utilize Squid’s NTLM_Auth functionality, and therefore the PC needs to talk directly to Squid. The PC now talks directly to Squid, which handles the NTLM authentication; Squid passes the request (and username information) to DansGuardian, which in turn requests the content back from Squid on the secondary port 3128. That second request to Squid is what is actually passed on to the Internet for its reply.

Dansguardian flow chart
The solution is a bit complicated, but in general it works quite well with Internet Explorer and Firefox. The problem comes into play when a browser is not NTLM-enabled. This actually happens more frequently than you might think in real life, as most of the Java runtimes don’t seem to be compliant. This causes a number of problems when it comes time to run various Java applets over the Internet, or when trying to hold a Java-based webinar.
Technically, it will still work, as Squid is designed to fall back to Basic authentication and the browser will prompt the user for their domain login credentials; however, that is annoying to the user at best, and a bad security practice at worst. It really isn’t a good idea to tell staff that it is okay or even normal to provide their domain credentials whenever a web application asks for them.
Hence, we created an ACL group within the Squid configuration that is coded to look for the headers from a group of known non-compliant “browsers” such as Java and Google’s Chrome. This workaround is explained in further detail throughout the rest of this article.

A Few Assumptions

For the sake of example, I will use the following names when defining my network going forward.
  • acme.local – This is the local (internal) domain suffix.
  • ACME – The old NetBIOS name for the domain.
  • etch1.acme.local – This is the Debian (Etch) server that DansGuardian is being installed on.
  • dc1.acme.local – This is a Microsoft Server Active Directory Domain Controller.
  • 192.168.0.0 / 255.255.255.0 – Class C subnet the PC’s are located in.
  • 192.168.0.11 – Windows DNS name server 1.
  • 192.168.0.12 – Windows DNS name server 2.

Install the Necessary Packages

This article assumes that a Debian 4.0 (Etch) system is up and running, with basic network connectivity established. For the enhanced DansGuardian logging features to be used (covered in a separate post), Apache, MySQL, and PHP will need to be installed, and it is helpful to have phpMyAdmin available for configuration and testing of the database.
To install and setup the filtering proxy server, use aptitude to install the following packages.

  • squid

  • dansguardian

  • samba

  • winbind

  • krb5-user

  • ntp

  • ntpdate
During the installation, it will prompt you for the following parameters. Answer them as follows:
Please specify the workgroup you want this server to appear to be in when queried by clients. acme.local
Modify smb.conf to use WINS settings from DHCP? No
Kerberos servers for your realm: dc1.acme.local
Administrative server for your Kerberos realm: dc1.acme.local
Once all of the packages are installed, run the following command to configure the Kerberos.
dpkg-reconfigure krb5-config
It will then prompt you with the following questions. Answer them as follows:
Default Kerberos version 5 realm: acme.local
Does DNS contain pointers to your realm's Kerberos Servers? Yes
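Before moving on, it’s worth a quick sanity check that Kerberos can actually reach the domain controller. The kinit and klist tools ship with the krb5-user package we just installed; if kinit complains about the realm, keep in mind that Active Directory realms are conventionally the uppercase form of the domain name (ACME.LOCAL in our example).

kinit Administrator
klist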

Configure the Name Resolution

We aren’t using the proxy server as a direct gateway to the Internet (with multiple NIC’s and such), so at this point, we deviated from the HowToForge article. In our case, it’s important that the /etc/resolv.conf file is pointed at our internal DNS servers, and that the search domain is configured correctly. Our resolv.conf file should look like this:
search acme.local
nameserver 192.168.0.11
nameserver 192.168.0.12

Synchronize the Time with the Windows Domain

Next we need to configure the NTP client to pull from our internal domain controller. Edit the /etc/ntp.conf file, and find the section near the top that lists the server(s). Be sure the default servers are commented out, and add a new line:
server dc1.acme.local iburst
Now that the client is configured, initiate a synchronization with the following command:
net time set -S dc1.acme.local

Configure Samba

It’s always a good idea to make a backup copy of the original configuration file that is installed with a new package. Make a backup copy, and vi the original file.
cp /etc/samba/smb.conf /etc/samba/smb.conf.original

Make the following changes to /etc/samba/smb.conf (a consolidated view of the result follows the list):

  • On line 53, set the interfaces = 192.168.0.0/255.255.255.0.

  • Uncomment line 59.

  • Uncomment line 91 and change to security = ads.

  • Uncomment lines 204 and 205.

  • Add the following lines before line 217:
winbind trusted domains only = yes
realm = ACME.LOCAL
winbind cache time = 3600
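Pulled together, the [global] settings we’ve just touched end up looking roughly like this (the stock lines that were merely uncommented vary by Samba version, so I’ve left them out):

[global]
   interfaces = 192.168.0.0/255.255.255.0
   security = ads
   realm = ACME.LOCAL
   winbind trusted domains only = yes
   winbind cache time = 3600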
Now restart the Samba and Winbind daemons.
/etc/init.d/samba restart
/etc/init.d/winbind restart

Join the Domain

To join the domain, use the net command:
net ads join -U Administrator
As I recall, it prompts you for the credentials of a user who has the authority to join machines to the domain, such as any Domain Admin account.
To test the join, you can issue the following commands and see their output:
etch1:~# wbinfo -t
checking the trust secret via RPC calls succeeded
etch1:~# wbinfo -u
ACME\administrator
ACME\guest
ACME\tsinternetuser

ACME\ghost_machine-09
ACME\user1
ACME\user2


etch1:~# wbinfo -g
BUILTIN\administrators
BUILTIN\users
ACME\domain computers
ACME\group policy creator owners
ACME\domain guests
At this point, you can be assured that the Linux box is a member of the domain, and that it is talking to the domain controllers.
You can also launch the Active Directory Users and Computers snap-in on a Windows PC, and expand the Computers branch. You should see the name of the Linux server listed as a “Computer”.

Configure the Squid Proxy Server

This is the section that I had the most trouble with. I wound up making a copy of the original configuration file with all of the default settings and comments, and then stripped them all out for my running configuration. It just got to be a hassle scanning through 3000 lines of comments for 30 lines of configuration.
cp /etc/squid/squid.conf /etc/squid/squid.conf.original
Then I used a grep command to eliminate the comments.
grep -v "^#" squid.conf > squid.conf.clean
This left a lot of blank lines in the file, but they were easy enough to remove with UltraEdit (a second pass of grep -v "^$" would do the same). After the configuration changes, I was left with the following configuration file:
#
# Squid configuration file -- Stripped of comments for clarity
#

# There are actually two proxies running - 1 for Dansguardian
# (from localhost) and the other for the masses
# The transparent proxy is bound to the localhost IP and listens on 3128
http_port 127.0.0.1:3128 transparent

# This one is bound to all IP's, and listens on port 8080. Port 8080
# is the default Dansguardian port. In
# our case, Dans has been reconfigured to use port 8081 instead to
# avoid confusion.
http_port 8080

# This parameter tells squid to pass the login credentials through to Dans
cache_peer 127.0.0.1 parent 8081 0 no-query login=*:nopassword

# The following 7 lines are default Squid configuration
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
access_log /var/log/squid/access.log squid
hosts_file /etc/hosts

# The following 3 lines configure NTLM authentication for browsers.
# This is the primary method used for proxy authentication
auth_param ntlm program /usr/bin/ntlm_auth --helper-protocol=squid-2.5-ntlmssp
auth_param ntlm children 5
auth_param ntlm keep_alive on

# This is a failsafe authentication in case the client application
# doesn't support NTLM. It uses Basic
# authentication and still authenticates off of the same ntlm_auth piece
auth_param basic program /usr/bin/ntlm_auth --helper-protocol=squid-2.5-ntlmssp
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 2 hours

# The following 25 lines are default Squid configuration
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 # https
acl SSL_ports port 563 # snews
acl SSL_ports port 873 # rsync
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl Safe_ports port 631 # cups
acl Safe_ports port 873 # rsync
acl Safe_ports port 901 # SWAT
acl purge method PURGE
acl CONNECT method CONNECT

# These are custom configurations for our environment.
# First we are creating an ACL group for people who were
# authenticated by the NTLM
acl ntlm_users proxy_auth REQUIRED

# This is a generic ACL of valid IP addresses on our network
# that have access to the proxy
acl our_networks src 192.168.0.0/24

# Some browsers don't support NTLM authentication. Rather
# than harass the user with pop-up's, we are excepting
# out known browser issues from the NTLM credentials.
# We know that Java generally does not support NTLM
# (although some newer versions may)
acl non_ntlm browser Java/1.4 Java/1.5 Java/1.6

# Oddly enough, Google's Chrome browser does not support NTLM
# authentication
acl non_ntlm browser Chrome

# The following 6 lines are default Squid configuration
http_access allow manager localhost
http_access deny manager
http_access allow purge localhost
http_access deny purge
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports

# Now we're actually allowing appropriate users access the proxy.
# The first step is to except out the non_ntlm browsers that
# were defined above. This bypasses that authentication
# scheme before it gets to the allowance of ntlm_users
http_access allow non_ntlm

# We want the localhost to be able to proxy
http_access allow localhost

# And finally, this is the line that allows anyone on
# our network, that has been authenticated by the NTLM piece to
# get through. It's not real intuitive, but it seems
# that it only authenticates the browser when it actually gets
# to this line. In other words, non_ntlm browsers that
# were allowed above don't get prompted.

# Note that any browser that bypasses the NTLM authentication
# will show up in the logs without a username.
http_access allow our_networks ntlm_users

# The following 4 lines are default Squid configuration
http_reply_access allow all
icp_access allow all
cache_effective_group proxy
coredump_dir /var/spool/squid
The important lines are near the top. I’ve included my own comments to explain what was actually changed and why I did that.
At this point, you can start the Squid proxy server, although it won’t really do anything until DansGuardian is set up.
/etc/init.d/squid restart

Set the Permissions for the winbindd_privileged Directory

Squid needs access to /var/run/samba/winbindd_privileged. We can easily fix this, but the permissions will reset when we reboot. Jesse Waters on ubuntuforums.org posted a script that will set the permissions on every system boot.
Create a file named /etc/init.d/winbind-ch.sh and paste the following into it.
#!/bin/sh
#set -x
WINBINDD_PRIVILEGED=/var/run/samba/winbindd_privileged

chmodgrp() {
  chgrp proxy $WINBINDD_PRIVILEGED || return 1
  chmod g+w $WINBINDD_PRIVILEGED || return 1
}

case "$1" in
  start)
    chmodgrp
    ;;
  restart|reload|force-reload)
    echo "Error: argument '$1' not supported" >&2
    exit 3
    ;;
  stop)
    ;;
  *)
    echo "Usage: $0 start|stop" >&2
    exit 3
    ;;
esac
#EOF
Make the script executable, then add it to the init scripts so that it runs at boot time:
chmod +x /etc/init.d/winbind-ch.sh
update-rc.d winbind-ch.sh start 21 2 3 4 5 .
Go ahead and execute this script to set the permissions.
/etc/init.d/winbind-ch.sh start

Configure the DansGuardian Web Filter

Again, I deviated quite a bit from the original article, primarily because I already had quite a bit of experience with DansGuardian.
cp /etc/dansguardian/dansguardian.conf /etc/dansguardian/dansguardian.conf.original
Now vi the dansguardian.conf file and make the following changes:

  • On line 3, comment out the item that says “UNCONFIGURED…”

  • On line 44, change the logfileformat to 2 (CSV-style format)

  • On line 62, change the filterport to 8081

  • On line 102, I set my filtergroups to 3. We have three groups in our environment – IT, Management, and everyone else. IT gets nearly everything, like .EXE’s, .ZIP’s, .DOC’s, and .XLS’s. Management gets a little less – only .DOC’s and .XLS’s. And finally, everyone else doesn’t get to download many of the privileged file types like Microsoft Office documents or executables.
Configure the three groups as appropriate. Start by defining the groups: vi the /etc/dansguardian/filtergroupslist file. Our list looks like this:
#filter1 = DEFAULT USERS (Everyone in this group unless otherwise noted)
#filter2 = IT Users (Can do just about anything)
ACME\administrator=filter2
ACME\ituser1=filter2
ACME\ituser2=filter2

#filter3 = Management (Most things except EXE's)
ACME\manageuser1=filter3
ACME\manageuser2=filter3
ACME\manageuser3=filter3
Now begin customizing the individual groups by vi’ing the dansguardianf1.conf file. Near the top, set the Banned Extension List and Banned MIME Type List paths to be specific to the filter group:
bannedextensionlist = '/etc/dansguardian/filter1/bannedextensionlist'
bannedmimetypelist = '/etc/dansguardian/filter1/bannedmimetypelist'
I have our naughtynesslimit set to 125, which seems to be appropriate for a workplace environment.
Now make duplicate dansguardianf1.conf files for each of the three groups.
cp /etc/dansguardian/dansguardianf1.conf /etc/dansguardian/dansguardianf2.conf
cp /etc/dansguardian/dansguardianf1.conf /etc/dansguardian/dansguardianf3.conf
Go in and edit both the dansguardianf2.conf and dansguardianf3.conf files to have the appropriate filter directory for the Banned Extension List and Banned MIME Type List. For example, dansguardianf2.conf should read:
bannedextensionlist = '/etc/dansguardian/filter2/bannedextensionlist'
bannedmimetypelist = '/etc/dansguardian/filter2/bannedmimetypelist'
Create the three filter directories and copy the two files from the root DansGuardian directory into these new folders.
mkdir /etc/dansguardian/filter1
mkdir /etc/dansguardian/filter2
mkdir /etc/dansguardian/filter3

cp /etc/dansguardian/bannedextensionlist /etc/dansguardian/filter1
cp /etc/dansguardian/bannedmimetypelist /etc/dansguardian/filter1
cp /etc/dansguardian/bannedextensionlist /etc/dansguardian/filter2
cp /etc/dansguardian/bannedmimetypelist /etc/dansguardian/filter2
cp /etc/dansguardian/bannedextensionlist /etc/dansguardian/filter3
cp /etc/dansguardian/bannedmimetypelist /etc/dansguardian/filter3
For each of the three groups, edit that group’s bannedextensionlist and bannedmimetypelist (e.g. /etc/dansguardian/filter1/bannedextensionlist) as appropriate. The stock list files that come with DansGuardian list virtually every possible extension. If an extension is listed in the file, those users won’t be able to download that type. Customize the list by commenting out extensions that you DO want those users to be able to download.
I recommend that management has at least .DOC, and .XLS capabilities, or else they will be calling IT on a regular basis.
IT generally has unique needs, so they should additionally be given access to download .ZIP, .EXE, and .ISO files. An example snippet follows.
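As an illustration, the management group’s /etc/dansguardian/filter3/bannedextensionlist might look like this in part – the stock file lists one extension per line, and a leading # comments an entry out (the exact entries below are just examples):

# management may download Office documents
#.doc
#.xls
# but executables and archives stay blocked
.exe
.zip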
Finally, we found that some websites consistently come up as blocked, even though they’re legitimate or even directly business-related. For those sites, edit /etc/dansguardian/exceptionsitelist and add the domain names of the offending servers. We currently have about 120 domains that we explicitly except out of the filter for various reasons. Some general examples are microsoft.com (for patches), adobe.com (for Acrobat Reader), and so on.
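The exceptionsitelist format is equally simple – one bare domain per line, with no protocol or www prefix:

microsoft.com
adobe.com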

Configuring the Client PC’s

The only configuration that we have to do on Windows 2000 or Windows XP machines is the proxy server settings. There is a DNS alias record that points the proxy.acme.local address back to the etch1 server. The actual settings are normally pushed out with Desktop Authority using a KIX script named win-ie.kix.
To manually make the change, open Internet Explorer. Click on Tools, then Internet Options. Select the Connections tab at the top of the screen. Click on LAN settings. On the bottom half of the screen, check the box to Use a proxy server… and set the address to proxy.acme.local. The port should be set to 8080.
image
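If you’d rather script the change than click through the dialogs (we push it with the KIX script mentioned above, but a plain batch file works too), the same settings live in the user’s registry. A sketch:

reg add "HKCU\Software\Microsoft\Windows\CurrentVersion\Internet Settings" /v ProxyEnable /t REG_DWORD /d 1 /f
reg add "HKCU\Software\Microsoft\Windows\CurrentVersion\Internet Settings" /v ProxyServer /t REG_SZ /d proxy.acme.local:8080 /f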
For Windows Vista and Windows 7 machines, there is one additional step required. By default, Vista doesn’t allow NTLM version 1 traffic to pass. Unfortunately, the Samba project is not quite ready to support version 2, so we have to reconfigure the Vista PC’s.
Go to Start-Run and type in secpol.msc. The Local Security Policy manager will come up. Drill down into Local Policies, and then click on Security Options. In the list on the right will be Network security: LAN Manager authentication level. Change this option to Send LM & NTLM – use NTLMv2 session security if negotiated. Click OK, and you can close out of the Local Security Policy editor.
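That policy maps to a registry value, which is handy if you need to push the change to a fleet of machines. Setting 1 should correspond to the Send LM & NTLM – use NTLMv2 session security if negotiated option, but verify on one PC before deploying it widely:

reg add "HKLM\SYSTEM\CurrentControlSet\Control\Lsa" /v LmCompatibilityLevel /t REG_DWORD /d 1 /f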

Troubleshooting Domain Connectivity/Authentication

To troubleshoot domain connectivity, refer to the section titled Join the Domain. First of all, you should see the server name listed in Active Directory Users and Computers. Seeing this indicates that your server has registered with the domain. If it isn’t listed, it’s possible that Samba wasn’t configured correctly, or that /etc/resolv.conf is not pointing to the correct internal DNS servers.
If you haven’t done so recently, it’s never a bad idea to restart the Samba and Winbind daemons.
/etc/init.d/samba restart
/etc/init.d/winbind restart
Another potential problem can occur if the server has the same name as a previous machine on the domain. It may become necessary to delete the computer’s registration from the domain by opening Active Directory Users and Computers, drilling down into Computers, and deleting the computer that has the name of this server. At that point, you would need to re-join the domain by issuing:
net ads join -U Administrator
Running the wbinfo command with the -t, -u, and -g switches (as shown earlier) should verify the trust and enumerate all of the users and groups on the domain. You can also manually attempt an authentication of a Windows domain account by issuing the ntlm_auth command.
ntlm_auth --username=ituser1 --domain=ACME
It will prompt you for the password for the selected account, and assuming that the username and password are correct, it should return:
NT_STATUS_OK: Success (0x0)
If Squid suddenly stops authenticating users, but the above commands continue to work, then it is probably a permissions issue between Squid and Winbind. The /etc/init.d/winbind-ch.sh script shown in the Set the Permissions for the winbindd_privileged Directory section may not be running, or may not be running at the correct point during startup. You can manually run it at any time by issuing:
/etc/init.d/winbind-ch.sh start

Troubleshooting Squid Configuration

First of all, be sure that the server has Internet connectivity. It should be able to ping a site such as www.google.com. This tells you that DNS name resolution is working, as well as the routing out to a foreign network.
If you are having troubles with getting Squid running, it’s best to bypass all authentication schemes. To do so, make the following changes in the /etc/squid/squid.conf file.

  • Comment out all seven lines starting with auth_param to disable NTLM and Basic authentication

  • In the ACL’s section, replace the line http_access allow our_networks ntlm_users with the following:
http_access allow our_networks
Then restart Squid.
/etc/init.d/squid restart

Troubleshooting DansGuardian

DansGuardian itself has proven to be very straightforward in terms of configuration. In the past, we have seen some instability over long periods of time, so I generally have a cron job set up to restart the service in the middle of the night. I add the following line to my /etc/crontab file:
30 1 * * * root /etc/init.d/dansguardian restart

Client Connectivity Issues

Be sure that the proxy server is configured in the web browser. In Internet Explorer, it’s under Tools-Internet Options, on the Connections tab. Click on LAN settings. Be sure you have the fully qualified domain name listed, as well as port 8080. For some sites, especially internal intranets, it may be necessary to create exceptions under the Advanced button.
If you go to a website and a box pops up asking for login credentials, it means that either your browser isn’t configured to support NTLM authentication, Squid isn’t configured correctly, or your PC isn’t talking NTLM version 1.
An incompatible browser tends to be the most common cause. Sometimes it’s not the browser itself, but a plug-in that gets called, such as Java. In that case, you will need to determine the User-Agent string that is being passed to Squid, and add it to the list of non_ntlm ACL exceptions. If you aren’t sure what agent name is being passed, the simplest way to find out is to use Wireshark to sniff the traffic. You will probably see something similar to this in the trace, usually in the first packet that is sent to the server:
User-Agent: Mozilla/4.0…
If all browsers are having the problem, including a modern version of Internet Explorer, you may have NTLM version 1 disabled, especially if this is a Vista PC. If this is the case, see the section titled Configuring the Client PC’s, where it talks about using secpol.msc to change the network security parameter.

Summary

This configuration of DansGuardian has been in production for us for about a year now, and it is working just fine. The trickiest part is really understanding the two portions of the Squid proxy server, and how they interact with DansGuardian.