DSC vs. GPO vs. SCCM vs. MDM

Microsoft Windows administrators now have a number of ways to manage their estates.

  • Group Policy (GPO)
    Allows very fine-grained control over every aspect of Windows. Primarily aimed at Windows desktops. Requires Active Directory (AD) and very careful configuration. Requires well-trained specialist staff to get it right.
  • System Center Configuration Manager (SCCM)
    Allows central control over software delivery. Also requires AD. Configuration of delivery packages can be complex, and very careful change control is required. Software delivery via SCCM can also be intrusive to users. Requires well-trained specialist staff to get it right.
  • Desired State Configuration (DSC)
    Though extended by Microsoft, this is actually part of a wider open standard, Open Management Infrastructure (OMI), and so applies to other platforms as well, including Linux. Mainly aimed at server configurations. Falls into the DevOps camp, as it defines server configurations in a purely text format that can easily be put under source control. DSC is typically dynamic and enforces the correct configuration (normally every 15 minutes), which greatly helps ensure secure configurations.
  • Mobile Device Management (MDM)
    Primarily aimed at mobile devices, this style of configuration is increasingly applicable to Windows desktops with the advent of Windows 10. Microsoft Intune is leading the way, with other MDM vendors following on. Not everything on the desktop can yet be controlled this way, even with Windows 10, but many key settings and controls are already available. A much simpler method for enforcing desktop settings than the other methods, it requires fewer administrators and much less specialist knowledge.

The article from FoxDeploy covers the first three of those and lays out the purpose of each. Well worth a read.

What is missing is the fourth method, which uses Mobile Device Management tooling. The leading contender here is Microsoft Intune. However, Intune is really only focussed on Windows 10 (desktop and mobile); it has limited control over other operating systems.

Servers only ever exist in a given state. If they deviate or we make changes, we refactor and redeploy. DSC drives it all and the machine will be up and running on a new OS, with data migrated in a matter of minutes.

For all practical purposes, the first true large-scale management tool we had for Windows systems in the modern era was Group Policy, or GPO as it is commonly abbreviated.

Comparatively, SCCM and MDT allow us to import an image from a Windows install disk and then run dozens of individual steps, customized based on the target machine's platform, model, office location and other factors. The sky is the limit.

Curated from DSC vs. GPO vs. SCCM, the case for each. – FoxDeploy.com

SSH Error: “ssh_exchange_identification: Connection closed by remote host”

Fixing up an SSH login error after moving to a new ISP: "ssh_exchange_identification: Connection closed by remote host", caused in this case by the lack of a reverse DNS entry.

At home, we’ve just switched from a slow (2 Mbps) ADSL connection to a wonderfully fast 40 Mbps connection provided by the excellent Origin Broadband, using the South Yorkshire fibre network provided by Digital Region.

I did have a small problem, though, that took some searching to resolve, so I'll detail it here in case others find it useful.
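
The diagnosis is straightforward to reproduce. As a rough sketch (the address below is a documentation placeholder; substitute your own public IP):

# Check whether the new public address has a reverse DNS (PTR) record
host 203.0.113.10

If there is no PTR record, the cleanest fix is to ask the ISP to add one. A common server-side workaround (assuming you control the server) is to stop it insisting on the lookup: set "UseDNS no" in /etc/ssh/sshd_config and restart sshd, or add the client's address to /etc/hosts.allow if TCP wrappers are in use.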

Flashing the BIOS from Linux (OpenSUSE 11.0)

I’ve been a bit quiet here recently because I’ve mainly been working with my business laptop currently running Windows 7. You can see more about this on my other blog – Much Ado about IT.

However, the power supply on that died recently so I’m back to my trusty desktop which runs OpenSUSE 11.0 24×7.

I managed to get hold of an upgraded CPU for this a while back but I’ve not really had an incentive to upgrade till now. The new CPU supports hardware virtualisation but I need to enable this in the BIOS. Of course, this machine (based on an ASUS A8N-SLI Deluxe motherboard) has an old BIOS that doesn’t allow me to turn on these features so I needed to upgrade to the latest (v1805).

But, I only run Windows under VirtualBox on this computer and I don’t have a floppy drive so updating a BIOS is no trivial matter!

After some Googling, here is the answer:

  1. Install the coreboot-utils package
  2. As root, at a command prompt, run “flashrom” to check that your chipset is supported for writing
  3. Extract the .bin file from the archive containing the updated BIOS image
  4. Make a backup of the existing BIOS with “flashrom -r backup-bios.bin”
  5. For the paranoid, try writing that backup back to the BIOS with “flashrom -wv backup-bios.bin” to ensure there are no errors. Reboot at this point for the really paranoid
  6. Now flash the new BIOS with a similar command to step 5
  7. Reboot and check that the new BIOS is OK
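
Putting those steps together, a whole session looks something like this. All commands run as root, and the new image name (1805.bin) is only illustrative - use whatever .bin file the ASUS archive actually contains:

flashrom                         # probe: is this chipset supported for writing?
flashrom -r backup-bios.bin      # back up the existing BIOS (step 4)
flashrom -wv backup-bios.bin     # paranoia: rewrite and verify the backup (step 5)
flashrom -wv 1805.bin            # flash and verify the new image (step 6)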

If you get an error from flashrom saying that the new BIOS is the wrong size, you may have had a problem unpacking the .bin file from the archive, as I did. Unpack the whole archive to a folder.

If flashrom doesn’t work for you, there are lots of other ways – I like using GRUB to boot from a floppy disk .img file – very “Linuxy”.

Version Control for Mortals

Version control systems (VCS, or Revision Control Systems or Source Control Systems) are designed for software developers. They enable one or more people to work on source code, annotate changes, split and merge the code, link to bug tracking systems and a number of other things that are interesting to developers but not to most people!

You might expect, then, that version control systems are of no interest to most people at all. But you'd be wrong.

What makes them interesting to most people is the fact that most of us are very poor at looking after those all-important files that make up our business assets, and often our personal assets too. We copy, save and delete stuff without giving too much thought to what we are doing. Then later on we scratch our heads and wonder what happened to xyz piece of critical information. I wonder how many times you've gone through your emails to get back a document that you know should be (and may well be) on your hard drive somewhere. With desktop search systems now all the rage, you will probably find the document, but then you realise that you used it as a template for another document and accidentally saved over the top! Or it got deleted when you were tidying up the old project folders …

Well, in step the version control systems to save the day. They will benefit anyone who recognises the above scenarios.

I’ve been using a VCS for over a year now for my day-to-day documents. I'm glad I did, too, as I used it to recover most of my documents after a drive failure earlier this year.

I started by using Subversion (SVN). This is a centralised version control system: it requires a central server that is the hub and master for all documents and changes. It is very well supported, and many low-cost web hosts also provide Subversion servers.

It is not ideal, though, for managing general documents. Firstly, it gets quite slow (especially for larger files), and changes can only be committed over a live network connection, so it's no good for disconnected work. Secondly, I found it very sensitive to how it was used, and I've often managed to get my repository into a mess that was very hard and very time-consuming to recover from. That is not acceptable in a system you have to rely on. I'm sure it's fine for its original purpose of source control, but it is not so good for managing day-to-day work.

Next I looked at distributed version control systems (DVCS). The most popular of these (the free ones, anyway) are Git, Mercurial and Bazaar. Mercurial seemed to be the one best developed for Windows, so I tried that first. It is reasonable, but it balks at large-ish files (a few MB; the Windows interface hangs quite often, at least on Windows 7), and that makes it unsuitable for our needs. Git has a lot going for it but is not so well developed for Windows and is rather more complex; I haven't tried it yet. So that leaves Bazaar. I'm now using Bazaar in anger and I'm quite impressed: it handles large files sensibly, it is really easy to set up, it doesn't complain when you move files and folders around, it doesn't get in the way, and it's reasonably fast. You can also use it with a central repository, like SVN.

Bazaar comes with integration into Windows Explorer, but you will probably want to look at the command-line options too for automation.

I’ve set up a schedule that commits my main repository, "workdocs", every morning, noon and afternoon (9am, 12pm and 4:30pm); the extra data that Bazaar keeps is simply backed up as part of the regular backup, since it is just a hidden folder in the root of the "workdocs" folder.
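
As a sketch of how such a schedule can be automated (the script name, repository path and commit message are illustrative only; on Windows the same one-liner can be driven from Task Scheduler rather than cron):

#!/bin/bash
# ~/bin/commit-workdocs.sh - snapshot the workdocs repository
cd ~/workdocs || exit 1
# --unchanged stops bzr complaining when nothing has changed since the last run
bzr commit --unchanged -m "Scheduled commit: $(date --rfc-3339=seconds)"

# crontab entries for the 9am, 12pm and 4:30pm commits
0  9,12 * * * $HOME/bin/commit-workdocs.sh
30 16   * * * $HOME/bin/commit-workdocs.sh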

I then commit changes manually as and when I want to after making significant changes to files.

For really critical files, you could couple Bazaar with the file-change detection of SyncBackSE to commit changes automatically, or use AutoHotKey to intercept the Ctrl-S key combination and run a commit before or after each save.



Sun’s VirtualBox gets on with it!

Yep, I keep being amazed by the quality of VirtualBox which is now owned by Sun.

I need to set up a virtual machine to test and demo Sun’s Identity Management (IdM) suite and it needs to be usable with VMware too. So I headed over to the VMware Appliances web site and downloaded a pre-canned Debian 5 server.

This was recognised fine by VirtualBox! I gave the VM a host-networked connection and, with no further configuration, fired it up. The first thing was to install some additional components, so I used the Debian package manager (aptitude) from the command line (no windowing GUI here!) to install the file, database and web server virtual packages. It just worked: no networking problems at all, and being host-networked, the VM is on my local LAN as well as the Internet with no problems.
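
For the record, the installs were nothing more exotic than the following (the package names are representative of what I mean by file, database and web server - pick whichever ones the appliance expects):

aptitude update
aptitude install samba mysql-server apache2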

It’s nice when things “just work”. That’s how it should be!

Of course, it probably wouldn't have been quite so simple if I had wanted a desktop as well. But there are also a number of pre-canned VirtualBox VMs available for download.

See, for example, VirtualBoxImages and HelpDeskLive.



Update

Hi, I thought I'd better put an update on here as to why I haven't posted anything recently.

Well, I've not done much with Linux recently. My OpenSUSE 11.0 desktop machine works and does pretty much everything I ask of it; at the moment, that is largely managing my photographs and not much else. This is because I'm out and about on a big project, so I'm using my monster laptop (Dell M1710), and that runs Vista as I have to be able to run Outlook, OneNote and other MS Office applications at full speed.

So, check out my general IT Blog: “Much Ado About IT”.

Shell script to Back up critical files (using RSYNC)

Following up from my article on backing up USB drives, this recipe backs up the critical files on my desktop to remote storage (a NAS device on my network). Note that PC2 is the desktop to be backed up, SLUG1 (192.168.1.2) is the NAS device and USER1 is the user id doing the backup.

#!/bin/bash

# Backup Key PC2 files to Slug1

# Sync 2007 picture folders
##rsync -rl /home/user1/pictures/2007/  user1@slug1:/public/pictures/2007/

# Ensure that /mnt/slug1-root/ is mounted
#if [ ! -e /mnt/slug1-root/user1/backups/PC2/bin/ ]; then
#    mount-slug-root.sh
#fi
# Ensure that /media/slug1-public/ is mounted
#if [ ! -e /media/slug1-public/DISK1.txt ]; then
#    mount-slug-public.sh
#fi

# NOTE that to configure the rsync sessions on SLUG1, edit the file /opt/etc/rsyncd.conf
TOPUB='user1@slug1::public'
# Use this form if not using sessions
#TOPUB='user1@slug1:/public'
# Or use this form if the remote folder is mounted locally
#TOPUB='/media/slug1-public'
TOJK='user1@slug1::pc2'
#TOJK='user1@slug1:/user1/backups/PC2'
#TOJK='/mnt/slug1-root/user1/backups/PC2'

JKDT=`date --rfc-3339=date`
JKLOG="/home/user1/Backups/pc2backup_$JKDT.log"

echo "Starting PC2 backup at `date`" >$JKLOG
echo "=================================================================="
echo "Starting PC2 backup at `date`"
echo "The log file is at $JKLOG, all backups are to SLUG1/pc2 or SLUG1/public"
echo " "

#--out-format=FORMAT     output updates using the specified FORMAT
#--log-file=FILE         log what we're doing to the specified FILE
#--chmod=CHMOD
#--exclude=PATTERN       exclude files matching PATTERN
#     --exclude-from=FILE     read exclude patterns from FILE
#     --include=PATTERN       don't exclude files matching PATTERN
#     --include-from=FILE
#--dry-run
#OPTS='--verbose --archive --recursive --links --perms --executability --owner --group --devices --specials --times --human-readable --delete --delete-after --stats --ipv4 --progress --password-file=/home/user1/bin/tmppw.tmp --dry-run'
OPTS='--verbose --archive --recursive --links --executability --devices --specials --times --human-readable --delete --delete-after --stats --ipv4 --progress'
echo "Back up various bits - WARNING: DELETES files from destination" >>$JKLOG

# rsync reads RSYNC_PASSWORD from the environment for daemon (::) transfers, so export it
export RSYNC_PASSWORD=`kdialog --password "Password for user1@slug1 please:"`
#kdialog --password "Password for user1@slug1 please:" >~/tmppw.tmp

echo "Backups to SLUG1/pc2"
echo " "
# ** JK BACKUPS **
echo "user1/bin"
echo "rsync $OPTS /home/user1/bin/ $TOJK/bin/" >>$JKLOG
rsync $OPTS /home/user1/bin/ $TOJK/bin/ >>$JKLOG 2>&1
echo "=========================================" >>$JKLOG
echo "user1/backups"
echo "rsync $OPTS /home/user1/Backups/ $TOJK/Backups/" >>$JKLOG
rsync $OPTS /home/user1/Backups/ $TOJK/Backups/ >>$JKLOG 2>&1
echo "=========================================" >>$JKLOG
#echo "rsync $OPTS /home/user1/Downloads/ $TOJK/Downloads/" >>$JKLOG
#rsync $OPTS /home/user1/Downloads/ $TOJK/Downloads/ >>$JKLOG 2>&1
#echo "=========================================" >>$JKLOG

echo "Backups to SLUG1/public"
echo " "
# ** Backups to public **

echo "user1/ebooks"
echo "rsync $OPTS /home/user1/eBooks/ $TOPUB/ebooks/sorting/" >>$JKLOG
rsync $OPTS /home/user1/eBooks/ $TOPUB/ebooks/sorting/ >>$JKLOG 2>&1
echo "=========================================" >>$JKLOG

echo "user1/pictures/Lnnnn"
echo "Back up picture files - WARNING: Does NOT delete files from destination" >>$JKLOG
OPTS='--verbose --archive --recursive --links --times --human-readable --stats --ipv4'
echo "rsync $OPTS /home/user1/Pictures/L2007/ $TOPUB/pictures/2007/" >>$JKLOG
rsync $OPTS /home/user1/Pictures/L2007/ $TOPUB/pictures/2007/ >>$JKLOG 2>&1
echo "rsync $OPTS /home/user1/Pictures/L2008/ $TOPUB/pictures/2008/" >>$JKLOG
rsync $OPTS /home/user1/Pictures/L2008/ $TOPUB/pictures/2008/ >>$JKLOG 2>&1
echo "=========================================" >>$JKLOG

echo "user1/backups/usbpen1 &amp; usbpen2"
echo "rsync $OPTS /home/user1/Backups/USBPEN1/ $TOJK/Backups/USBPEN1/" >>$JKLOG
rsync $OPTS /home/user1/Backups/USBPEN1/ $TOJK/Backups/USBPEN1/ >>$JKLOG 2>&1
echo "=========================================" >>$JKLOG
echo "rsync $OPTS /home/user1/Backups/USBPEN2/ $TOJK/Backups/USBPEN2/" >>$JKLOG
rsync $OPTS /home/user1/Backups/USBPEN2/ $TOJK/Backups/USBPEN2/ >>$JKLOG 2>&1
echo "=========================================" >>$JKLOG

echo "Google Earth places"
echo "Back up Google Earth myplaces.kml" >>$JKLOG
rsync $OPTS /home/user1/.googleearth/myplaces.kml $TOPUB/maps+walks/pc2-myplaces.kml >>$JKLOG 2>&1
echo "=========================================" >>$JKLOG

#echo '' >~/tmppw.tmp
#rm ~/tmppw.tmp

echo " "
echo "ENDING PC2 backup at `date`" >>$JKLOG
echo "ENDING PC2 backup at `date`"
echo "=================================================================="

# To run under schedule
#    Log
#    Replace password

I have a similar script that runs on the NAS device, which backs up key files from it to a remote hosting service on a different continent! That way, I don't have to worry about the house burning down or being burgled.
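
The offsite leg is just another rsync job. As a hypothetical sketch (the host name and remote path below are invented for illustration), it amounts to:

# Push the NAS's public share to the remote host over SSH
rsync --archive --delete --compress -e ssh /public/ user1@offsite.example.com:/backups/slug1-public/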

Automatically Backing up a USB Drive with RSYNC (KDE)

USB drives of all kinds need to be backed up, and the best backup is an automatic one (it's the only way to make sure that it gets done!).

So here is one recipe for doing just that using RSYNC and some BASH scripting magic.

I've split this into two files. You don't have to do this, of course, and one file may well be better for you. I used two because I can run the second one manually as well. Put everything in autorun.sh if you want to back up each drive individually; note, however, that KDE produces an annoying extra dialog (a security warning) asking if you really want to run the autorun.

  • autorun.sh
    This resides in the root of the USB drive and is executed automatically by KDE when the drive is detected (though not if the drive is attached when booting)
  • usb-backup-manual.sh
    This is a bit of a nasty hack: I have manually configured a list of drives that might be attached so that I can back them all up together. Not elegant, but it works for me.

autorun.sh

#!/bin/bash

# KDE will automatically run an executable file called: .autorun, autorun or autorun.sh (in that order)
# Alternatively, a non-executable file called .autoopen or autoopen can contain a file name
# of a non-executable file on the media which will be opened with the default app for that file.
# See: http://standards.freedesktop.org/autostart-spec/autostart-spec-0.5.html#mounting

# Also see: http://b50.roxor.pl/~michal/linux/autorun.txt
# for some interesting ideas

# Where are we running from? e.g. /media/usbpen1
mediaDir=$(dirname "$0")

kdialog --title "USB Drive Backup" --yesno "I'd like to backup the USB drives, can I?"
if [ $? = 0 ]; then
echo " OK Selected, I'm going"
echo "Autobackup run: `date`" >usb-linux-auto-backup.log
exec ~/bin/usb-backup-manual.sh
else
echo " Cancel selected, so do nothing - bye."
fi

usb-backup-manual.sh

#!/bin/bash

#http://www.sanitarium.net/golug/rsync_backups.html
#http://www.mikerubel.org/computers/rsync_snapshots/
#http://rsync.samba.org/examples.html

echo "Starting USB Backup: `date`"
echo "Starting USB Backup: `date`" >~/Backups/usb-backup-manual.log

# From
MNT="/media"
# To
TO="/home/julian/Backups"

dcopRef=`kdialog --progressbar "Starting backup - press cancel to stop further processing (no next step)" 4`
dcop $dcopRef showCancelButton true

#until test "true" == `dcop $dcopRef wasCancelled`; do
for f in "CF2G1" "SD1G1" "USBPEN1" "USBPEN2"
do
dcop $dcopRef setLabel "Backing up $MNT/$f  ==>  $TO"
echo "--------------------------------------"
echo "$f  ==>  $TO"
inc=$((`dcop $dcopRef progress` + 1))
sleep 2
if [ -e $MNT/$f ]; then
  dcop $dcopRef setProgress $inc
  RSCMD="rsync --recursive --times --delete-during --stats --human-readable -h $MNT/$f $TO"
  echo $RSCMD
  echo $RSCMD  >>~/Backups/usb-backup-manual.log
  $RSCMD
  dcop $dcopRef setLabel "RSYNC for $f finished"
else
  dcop $dcopRef setProgress $inc
  dcop $dcopRef setLabel "$MNT/$f not mounted"
  echo "$MNT/$f not mounted"
  echo "$MNT/$f not mounted"  >>~/Backups/usb-backup-manual.log
fi
echo "======================================="
sleep 2
done

dcop $dcopRef close

echo "End: `date`"
echo "End: `date`" >>~/Backups/usb-backup-manual.log

Note the use of KDialog to provide a minimal GUI. In the second file, KDialog produces a progress bar.

Also note the RSYNC parameters. These are always painful to get to grips with, so it is nice to have an example to work from. In this case I am backing up, so I make sure that the backup is an exact copy of the original (as opposed to synchronising, which would allow changes to happen on either side).
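
To make the distinction concrete, here is a stripped-down sketch (paths illustrative). The first form mirrors the source exactly, deletions included; the pair of --update runs below it is a crude two-way synchronisation that copies newer files in each direction and never deletes:

# Backup: make the destination an exact copy of the source
rsync --archive --delete /media/USBPEN1/ ~/Backups/USBPEN1/

# Synchronise (crudely): the newer file wins in each direction, nothing is deleted
rsync --archive --update /media/USBPEN1/ ~/Backups/USBPEN1/
rsync --archive --update ~/Backups/USBPEN1/ /media/USBPEN1/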

Font sizes and DPI

This seems to be a problem that won't go away. It seems inordinately hard to get a good-looking set of fonts at the correct size. It is not that there aren't some nice fonts available; there are, at last, fonts under Linux that often look superior to the Microsoft ones. It's just that it is difficult to get the whole look and feel correct.

This is especially true when mixing Gnome-based applications (Firefox and Thunderbird, for example) with KDE. OpenOffice also refuses to play nicely.

Anyway, grumping over: there is an excellent article on the Mozilla site about how to improve some of this by getting the correct DPI settings for your monitor (this is especially noticeable on my 24″ beast!). The article is here.
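
One concrete knob is worth knowing about (a sketch of my own, not taken from the article): you can pin the DPI that Xft-based applications use once you have worked it out from the resolution and the physical panel width. For example, a 24″ screen at 1920×1200 with a panel roughly 518 mm wide gives 1920 ÷ (518 ÷ 25.4) ≈ 94 DPI.

# Force a known DPI for Xft-based applications (94 is just the worked example above)
echo "Xft.dpi: 94" >> ~/.Xresources
xrdb -merge ~/.Xresources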

Thoughts on OpenSUSE 11.0

Here are my experiences installing OpenSUSE 11.0 on my desktop PC (I had already successfully installed it in a VM). I opted for a KDE 3 desktop – I don't especially like Gnome, and KDE 4 is not ready for day-to-day use as far as I am concerned.

  • No problems at all with mixed IDE/SATA drives and GRUB 😉
  • Usual problems with NVidia drivers (corrupt screen on first entry to KDE). But this time I could boot into safe mode, add the NVidia repository, install the drivers and restart – much easier than previously, if still not quite perfect. (A sketch of the repository commands appears after this list.)
  • I did have some problems setting up two screens this time but it is the first time I’ve had my big monitor (24″) at install time – I had to fiddle with the settings in the standard screen settings tool before I could get the NVidia settings tool to correctly recognise the size of the smaller screen.
  • I have a small issue with the NVidia drivers. I think there is an issue with the latest drivers: I get an annoying screen blank every now and then. It is most noticeable with some JavaScript-enhanced web sites under Firefox, for some odd reason. Under OpenSUSE 10.3 this was crashing KDE (which is why I got round to installing 11!)
    UPDATE 2008-07-17: This may, in the end, have been a hardware issue – I reseated the cables and everything is stable at the moment
  • YaST gets better and better. This is where you really see the benefits of being backed by a professional organisation (Novell).
  • You still can’t set up a network bridge in YaST though 🙁
    However, it is easy if you follow the instructions in my previous blog entry.
  • Everything seems a bit faster though that might be down to a fresh install?
  • One thing that is massively faster is installation and update of packages – Phew! At last, one of the biggest issues with SUSE has finally been cracked. It is now very fast indeed.
  • The extra back/forward buttons on my Logitech mouse work without any additional configuration – nice touch! Though the left/right scroll still doesn’t work 🙁
    UPDATE 2008-07-20: Ah ha! This one is due to an oddity in the key-mappings of the Logitech mouse that I use. Hopefully, I’ll now be able to sort this out when I get a chance
  • There is still a bug in YaST that drops the default route at random; this stops host networking from working under VirtualBox.
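
As promised in the NVidia item above, here is a sketch of the safe-mode repository dance. The URL and package name are what I believe NVIDIA published for 11.0 at the time, so treat them as assumptions and check the repository page first:

# Run as root from the safe-mode console
zypper addrepo http://download.nvidia.com/opensuse/11.0 NVIDIA
zypper refresh
zypper install x11-video-nvidiaG01   # or the G02 variant, depending on the card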