[Fixed] Server 2012 R2 Essentials – Client Restore Crashes

We’ve been running Server 2012 R2 Essentials on a Lenovo TS140 for some time now. Client backups were running nightly and all appeared to be just fine.

When attempting to restore files, right after selecting the volume you want to restore from, the wizard would crash with the “Restore Files and Folders wizard has stopped working” dialog. Most annoying…

Restore Files and Folders Wizard – Crash Server 2012 R2 Essentials

There are a myriad of posts all over the web about uninstalling this or that KB update. I had none of those KB updates installed anyway.

It did turn out that Microsoft knows about this and has a hotfix to correct the problem. The trouble is that they make it very hard to find and download; you request the hotfix and they email you the download link.

Here’s the link to the hotfix 3045682: “You cannot restore files and folders from Server Essentials Backup on a Windows-based computer”

“This article describes an issue that occurs when you restore files and folders in Windows 8.1, Windows Server 2012 R2, Windows 8, Windows Server 2012, Windows 7, or Windows Server 2008 R2. A hotfix is available to resolve this issue. Before you install this hotfix, check out the Prerequisites section.”

https://support.microsoft.com/en-us/kb/3045682
The “Hotfix Download Available” button points here: http://support.microsoft.com/hotfix/KBHotfix.aspx?kbnum=3045682&kbln=en-us

The download link points to a hotfix request page; you enter your email address there and Microsoft emails you the actual download link.

I checked that the prerequisites were all installed and once the hotfix was installed and the machine rebooted, I was able to successfully restore files again.

Posted in Computer Stuff, Windows Home Server

Using the Repository Pattern with EntityFramework CodeFirst and ASP.Net MVC4

Intro  

Writing web applications has gotten vastly easier using ASP.NET MVC4 and Entity Framework, especially if you’re counting lines of code in your application layer as an indicator.  I’m a huge fan of Entity Framework 5 Code First as a mechanism for creating and interacting with your database for line-of-business apps.  With that said, you should STILL design and normalize your database using good practices. EF Code First simply allows you to specify that database design using C# code in terms of the domain entities instead of SQL scripts.
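For example, here’s a minimal Code First sketch. The Customer entity and ClientsDbContext names are just illustrative placeholders, not from any real project:

using System.ComponentModel.DataAnnotations;
using System.Data.Entity;

// A domain entity: the table design expressed in C# instead of a CREATE TABLE script.
public class Customer
{
    public int Id { get; set; }            // becomes the primary key by convention

    [Required, MaxLength(100)]
    public string Name { get; set; }       // maps to a NOT NULL nvarchar(100) column

    public decimal BalanceDue { get; set; }
}

// The DbContext maps entities to tables and can create the database on first use.
public class ClientsDbContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }
}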

If you follow the Microsoft script for ASP.NET MVC4 using Entity Framework for your models, you’ll end up with your database interaction logic crowding your Controllers and Views.

The trick to keeping your application lightweight and easy to change is in abstracting the database away into Repositories.  This lets your Controllers and Views talk in terms of simpler Domain objects while using an entirely abstract mechanism for handling the CRUD (Create-Retrieve-Update-Delete) operations.  Even Domain logic should be factored out whenever possible to keep the Controllers and Views as free of decision logic as possible.

I’m still on the fence about the value of having a separate Unit-of-Work and Repository when dealing with such short-lived contexts as we have in MVC-based web applications.  I use both extensively in client-side desktop applications where the repositories may live quite a while and have multiple operations performed against them before they go away.

Implementation of this one can range from simple to complex depending on how much you want or need to abstract.

Essentially, a repository enables the following kind of conversation in your controller (a rough sketch in code follows the list):

  • Get me an instance of a Repository class suitable for this controller
  • Ask that Repository to get you a collection of simple domain object(s) that your app can work with, e.g. Repository.GetAllCustomers().
  • Repository says “Here they are” as a collection of friendly POCOs, complete with navigation props etc.
  • Your controller sends this off to the View.
  • The View does stuff to these and says “Here, I’m done.”
  • The Controller says “Repository, put these back where you got them from.”  Or – “I created some new ones myself, Add these.”
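A minimal sketch of that conversation in code. The ICustomersRepository and CustomersController names are made up for illustration, the Customer POCO is the one sketched in the intro, and constructor injection assumes some dependency resolver is wiring things up:

using System.Collections.Generic;
using System.Web.Mvc;

// Illustrative interface only: the controller talks to this, never to EF directly.
public interface ICustomersRepository
{
    IEnumerable<Customer> GetAllCustomers();   // "Here they are" as friendly POCOs
    void Add(Customer customer);               // "I created some new ones, add these"
    void Update(Customer customer);            // "Put these back where you got them"
    void Save();                               // persist any pending changes
}

public class CustomersController : Controller
{
    private readonly ICustomersRepository _repository;

    public CustomersController(ICustomersRepository repository)
    {
        _repository = repository;              // injected, so this controller never news up a DbContext
    }

    public ActionResult Index()
    {
        return View(_repository.GetAllCustomers());   // the View just renders simple domain objects
    }

    [HttpPost]
    public ActionResult Edit(Customer customer)
    {
        if (!ModelState.IsValid) return View(customer);
        _repository.Update(customer);          // hand the edited object back to the repository
        _repository.Save();
        return RedirectToAction("Index");
    }
}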

The closer you can get to this simple level of conversation between your controllers, views and repositories, the easier it is to change your user interface as needs and requirements change.

In my ASP.NET MVC4 apps, I actually prefer NOT to even have ‘models’ in my web app.  Various repository classes should be serving customized mini-models that represent only the specific business areas of interest.  You want to manage the Cars owned by various Customers?  Then even though you have lots of other things in your database, create a repository that properly models only what’s needed for the UI to handle this task.

Ultimately that’s it.  The main goal is to have your Repository implement an interface that is simple enough to be easy to maintain yet rich enough to solve the functional needs of the controller and its Views. It should abstract away details of what sort of procedure is necessary to get/save db items and where they’re being saved.  It is perfectly ok to have your repository implement things that you might think of as complex queries.  It’s really “data or business layer in a box”.

Generally speaking, here are the steps I take (a skeleton showing how the pieces fit together appears after the list):

  • Define an IEFClientsRepository interface with the methods you want.  e.g.:  Add(), Update(), Find(), FindMany(), ComputeXYZ(), GetCustomersWithBalancesDue(), PhoneHome(), Save(), Discard(), whatever.
  • Implement concrete MyEFClientsRepository : IEFClientsRepository and write all your methods to solve these problems. MyEFClientsRepository should mostly hold the kind of code you’d previously been putting in your Controller. For Entity Framework related repositories, this repository will hold an instance of your custom DbContext.
  • Then, defer everything DbContext-related to custom methods that you add inside your custom DbContext class, e.g. ClientsDbContext.  Don’t put those directly inside your Repository.
  • So, for your ClientsDbContext that derives from DbContext, embellish it with all the extra public methods you need for interacting specifically with the DbContext: managing the actual entity state/validation, generating complex queries, etc.  For example, ClientsDbContext is what actually does the “GetCustomersWithBalancesDue()” work and returns the LINQ IQueryable or the actual data collection.
  • Again, your MyEFClientsRepository will hold an instance of your ClientsDbContext which can get created at the time your Repository is constructed and goes away with the Repository.
  • The UnitOfWork pattern is essentially VERY SIMPLE to add.  It is simply the container that holds/manages your DbContext and has Save/Discard methods.  It’s not really necessary I think for ASP.Net MVC4 work because the contexts live for such a short time.  UnitOfWork is very useful for Client side Desktop apps where you might create a repository that sticks around as long as some UI module lives.  In this case, a New Unit of work is created each time you want to simulate a “transactional” step that might be discarded prior to saving.  So, your window / control /whatever has a Repository instance.  Any time you run some new command, you get a new Unit Of Work, do your job, then toss it.
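Here’s a rough skeleton of how those pieces fit together. The method bodies are just an assumption about what GetCustomersWithBalancesDue() might do, not a definitive implementation; Customer is the POCO from the intro sketch:

using System;
using System.Data.Entity;
using System.Linq;

// Step 1: the interface (trimmed to a few methods for brevity).
public interface IEFClientsRepository : IDisposable
{
    IQueryable<Customer> GetCustomersWithBalancesDue();
    void Add(Customer customer);
    void Save();
}

// Step 4: the ClientsDbContext from the intro sketch, extended with a query method.
// Everything EF-specific, including the complex queries, lives here.
public class ClientsDbContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }

    public IQueryable<Customer> GetCustomersWithBalancesDue()
    {
        return Customers.Where(c => c.BalanceDue > 0m);
    }
}

// Step 2: the concrete repository holds a ClientsDbContext that lives and dies with it.
public class MyEFClientsRepository : IEFClientsRepository
{
    private readonly ClientsDbContext _context = new ClientsDbContext();

    public IQueryable<Customer> GetCustomersWithBalancesDue()
    {
        return _context.GetCustomersWithBalancesDue();   // defer the query to the DbContext
    }

    public void Add(Customer customer)
    {
        _context.Customers.Add(customer);
    }

    public void Save()
    {
        _context.SaveChanges();
    }

    public void Dispose()
    {
        _context.Dispose();
    }
}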

A Repository is a storage/retrieval class implemented for some specific back end data store like EF, or XmlDb or whatever.  So if your app can have optional ways of storing data, you’d implement a Repository concrete class for each storage type, but they’d all share the common IRepository interface you defined.

Notes:  The most obvious reasons for creating custom IRepository interfaces are:

  • They allow you to transparently implement more than one type of backend storage system,
  • They allow testing your front end / middle layers without needing any backend storage system.
  • They ensure that your front end is only loosely coupled with your database.

I still think you can skip the UnitOfWork and just have the repository manage an instance and lifetime of your ClientsDbContext for MVC web apps, because the Repository gets recreated every time the controller is instantiated.

Now, if you introduce MEF into this picture, you can have MEF actually manage the lifetime for you and have it shared and live beyond a single controller instance.
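A minimal, self-contained sketch of that idea using System.ComponentModel.Composition. The MefManagedClientsRepository name and the empty interface are purely illustrative; the point is the Shared creation policy:

using System;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;

public interface IClientsRepository { }

// Marked Shared, so the container hands every importer the same long-lived instance.
[Export(typeof(IClientsRepository))]
[PartCreationPolicy(CreationPolicy.Shared)]
public class MefManagedClientsRepository : IClientsRepository { }

public static class CompositionDemo
{
    public static void Main()
    {
        var catalog = new AssemblyCatalog(typeof(CompositionDemo).Assembly);
        using (var container = new CompositionContainer(catalog))
        {
            var first = container.GetExportedValue<IClientsRepository>();
            var second = container.GetExportedValue<IClientsRepository>();
            Console.WriteLine(ReferenceEquals(first, second));   // True: MEF owns the lifetime
        }
    }
}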

Final NOTE:  I suggest creating an INTERFACE for your ClientsDbContext, e.g. IClientsDbContext, and having your Repository work against this.  That allows you to test your app down to the Repository layer without needing an actual DbContext.  You can implement a cheap fake DbContext that serves static data (a sketch follows).
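Something along these lines; the member list is only an assumption about what the repository actually needs:

using System.Collections.Generic;
using System.Linq;

// Only the members the Repository actually uses need to be on the interface.
public interface IClientsDbContext
{
    IQueryable<Customer> GetCustomersWithBalancesDue();
    int SaveChanges();
}

// A cheap fake backed by static in-memory data, for testing the Repository layer.
public class FakeClientsDbContext : IClientsDbContext
{
    private readonly List<Customer> _customers = new List<Customer>
    {
        new Customer { Id = 1, Name = "Test Customer", BalanceDue = 42m }
    };

    public IQueryable<Customer> GetCustomersWithBalancesDue()
    {
        return _customers.Where(c => c.BalanceDue > 0m).AsQueryable();
    }

    public int SaveChanges()
    {
        return 0;   // nothing to persist in the fake
    }
}

The real ClientsDbContext then implements the same interface, and MyEFClientsRepository takes an IClientsDbContext in its constructor instead of constructing the concrete class itself.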

That’s all for now, but I have a lot more I’ll post on this topic as time permits…

Posted in Programming

Pasting HTML into Microsoft Excel

I frequently find that I need to paste data from the web into Excel.  Sometimes that operation drags along a bunch of unwanted controls that are really difficult to remove.


The simplest way I know of to remove these is to copy the data to a new sheet in the workbook using Paste Special and then remove the original content.

  • Paste the original data into a worksheet.
  • Select/Copy all the content in the sheet with Ctrl-A and Ctrl-C.
  • Open a new Sheet in the workbook and Right-Mouse->Paste Special.  Select “Values” and click OK.
  • Go back and delete the previous sheet.

Maybe there are some easier ways, but this isn’t too bad.


Posted in Computer Stuff

Replacing Word Doc Hyperlinks using OpenXML SDK

When moving from the CTP version of the Office OpenXML SDK to the released version, the API changed in many significant ways.  One thing that changed was moving from strongly typed attribute values to just strings. Another change I had recently struggled over was related to the handling of Hyperlink relationships.

In the CTP version, there was a fairly intuitive pair of methods, appropriately named doc.documentPart.AddHyperlinkRelationship(…) and doc.documentPart.DeleteHyperlinkRelationship(…).  So, naturally, if you needed to replace one hyperlink with another, you could simply delete the existing one, add a new one with the same relationship ID, and you were golden.

In the released version of the Office OpenXML SDK, the DeleteHyperlinkRelationship() method went mysteriously missing.  The documentation is silent on this front.  There still exists the pair of methods doc.documentPart.AddExternalRelationship(…) and doc.documentPart.DeleteExternalRelationship(…), so what gives?  There must have been some underlying reason in the API for removing the DeleteHyperlinkRelationship method, but since I wasn’t able to find any info on it, here’s how to replace an existing hyperlink relationship (obviously, you’ll want to find the one you’re replacing first, so you have the Id):

var mdp = doc.MainDocumentPart;

// Remove the old relationship, then re-add one with the same Id so the
// w:hyperlink element in the document body still resolves.
mdp.DeleteReferenceRelationship("rId11");
mdp.AddHyperlinkRelationship(new Uri("http://www.microsoft.com"), true, "rId11");
Posted in OpenXML, Programming

Office Excel 2010

Microsoft Office 2010 Excel Trust Center

Sigh…  I can’t decide whether to laugh or cry following an update to Microsoft Office 2010.  Let me say that I’ve always been pleased with the capabilities of Excel, but I can’t express how foolish it makes me feel when I struggle to do what should be simple.  The Office Excel 2010 Trust Center dialog is designed to confuse (or at least dispense with tradition and use inverted logic).

Ok, so I have this program I’ve been using for years that exports its data in XLS format.  I have always loaded this up in Excel and done analysis and saved the results, and all was good.

Now, Excel 2010 has decided that this document can only be opened in protected mode.  Fine, so I click the option that says allow editing.  Only I’m not allowed to save any of the document changes.  Instead I’m presented with this formal looking dialog box.

So, fine, I’ll just follow the rules and make sure that “Excel 2007 and later Workbooks and Templates” is set to allow it to “open and save”.  But apparently that’s not enough.  Because even though I’ve done this, it still won’t allow saving, upgrading to .xlsx, saving as .csv, or copy/pasting to a new worksheet and saving.

After nearly 2 hours of searching the web, and following every foolish Microsoft Tech Note I could find on the subject, I can’t save, print or copy my data.  They mention changing registry settings to allow this, but there is no mention of just what to change.

UPDATE:

So the problem was the pilot, not the plane.  However, the plane was sadly designed for the freeway, not the airways.  Well, anyway it seems that I’m not the only one fooled by the UI.  For example, take a look at this dialog from the Excel Trust Center.  I really did think that all I needed to do was make sure Open and Save were checked.

You have a list of file types and check boxes for Open and Save.  At the bottom, it says “Open behavior for selected file types” and says how to handle selected types.  Intuition says: simply check off the file types you want to allow to open and save.  (Although I was puzzled why it would allow you to check Save without checking Open, and why there was no bullet choice for opening/saving selected types without using Protected View.)

Boy was I WRONG!!!  Reading the top paragraph carefully it says “By selecting Open, Excel BLOCKS this type of file or opens it in Protected View.  By selecting Save, Excel prevents saving this type of file”.  Huh?  What?  Negative logic?  So, you turn ON a check box to Turn Off the feature?  OMG.  Sorry, I don’t live in that parallel universe.  🙂

This is a poor design, not because it doesn’t do what it says, but because it is counterintuitive.  If the “Open” and “Save” columns were titled “Prevent Opening” and “Prevent Saving”, there would never have been any confusion.  Clearly, however, I didn’t read every word in the dialog the first time.  It SEEMED intuitive to use just looking at the dialog options, column titles, etc., so it’s my own fault, but I hate this dialog so much I unchecked everything in it.

Maybe if I weren’t a software developer this wouldn’t have fooled me.  Alas, a quick search on the Web shows that this is widely misinterpreted.

Microsoft, please correct this design.  While looking for a solution to this simple problem I kept reading suggestions to just go back to Office 2007 or going to Google Docs.  Simply because they didn’t interpret this dialog correctly and Office was doing exactly what it said, but not what you wanted.  I shared this agony for the better part of a sunny Saturday afternoon when I’d rather have been doing something else.

What’s really interesting is that this entire problem could have been eliminated if I had written this blog post first…

Posted in Computer Stuff

Deploying Silverlight 4 WCF Ria Services Application

Problems Deploying a Silverlight 4 application using WCF RIA Services on a Windows Home Server (IIS 6)

This is not a comprehensive how-to post.  Just notes on what it took to get it going…

Well, that turned into a two-weekend time sink.  The deployment of the SL RIA Services web project is pretty straightforward from Visual Studio 2010.  Once copied up, the application was easy to get running; however, getting data connectivity through RIA Services was a different story.  These problems are just not debuggable without using Fiddler.

So, the application ran, but showed no data.  Fiddler was showing a 404 error accessing .svc queries.  That was first because the .svc extension wasn’t mapped to aspnet_isapi.dll, and then because I hadn’t unchecked the “Verify that file exists” option.  After that, the server started returning error 500 because I hadn’t allowed execute access to the subdirectories…

A number of individual things needed to be set just so in order for everything to work.  Many were obvious to me and some were not.   Figuring out what mattered and what didn’t for my application was really a chore.

  • The Virtual Directory needed execute permissions on Scripts and Executables set.  It is important that this extend to the bin, clientbin and aspnet_client folders as well.
  • The Application Extension Mappings needed “.svc” mapped to “C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319\aspnet_isapi.dll” and the “Verify that file exists” option must be UNCHECKED, because RIA Services uses a dynamic service file, so it doesn’t actually exist on disk.  Also ensure that “All Verbs” are enabled.
  • Obviously make sure that the default content page is appropriate in the “Documents” tab if you don’t want people having to type in your start page name.
  • In “Directory Security”, I enabled anonymous access and disabled everything else.  Need to revisit whether integrated windows authentication can be turned on in this case instead of anonymous access.
  • In the “ASP.NET” tab, make sure that the correct .NET framework version is selected.  I also removed all the connection strings except the one I wanted RIA services to use.  This shouldn’t matter as connection strings should be used by name anyway.
  • In the “HTTP Headers” tab, you need to add the following MIME types:
    • .xaml  ->  application/xaml+xml
    • .xap -> application/x-silverlight-app
    • .xbap -> application/x-ms-xbap
  • I also enabled Immediate Content Expiration in the “HTTP Headers” tab, but I don’t think I really need or want this in the final deployment.

Don’t forget the IISRESET to ensure that all these settings are active.

Posted in Computer Stuff, Programming, Windows Home Server

Problem Syncing WHS User Passwords on Win7 x64 PCs

WHS Password Synchronization Problems

So I was just asked a question about how to change a user login password on WHS.  To recap, WHS requires user passwords to be the same between client PCs and the Server to allow transparent access to the server’s shared folders.

The process has always been simple.  Change the password on your local machine, log off, then back on again.  Next thing you see is that the WHS tray icon notifies you that the user password  does not match the one in WHS.  Clicking on that notification brings you to the Update Password form.

From here it’s a matter of choosing whether you want to keep the password set on this PC or the one the server knows about.  So, we choose the option to keep the password on this machine, since we just changed it here.  You enter the current password you just set on this machine in the appropriate box, and the old (previous) password in the Server password box, and click the OK button.  If you entered them without typos, you’re all set, and now the PC and Server know the same password for your user account.

But WAIT,  on Win7 x64 systems, this doesn’t work!

Instead, it insists you’ve typed in the wrong password.  And after a few tries, it says your account is disabled.  Huh?  Well, it’s not really disabled, it just makes you wait a bit to try again.

The solution is not at all obvious to us plain folk.  I tried all manner of things to get them to synchronize after that, and occasionally managed to get them sync’d, but only after lots of changing passwords on the server and client, doing headstands, throwing salt, etc.  No dice…

One simple solution lies in changing the local Network Security policy (the equivalent registry change is noted after the steps).

  1. In the start menu’s search bar, type:  SECPOL.MSC  and hit return.
  2. Find and Expand the Local Policies folder in the Left hand Pane.
  3. Select the Security Options folder under the Local Policies.
  4. Scroll down in the right hand pane and double click on this entry:   Network security: LAN Manager authentication level
  5. Set the authentication level to:  Send NTLM response only
  6. Click OK.   Now, you are free to change the password on your PC or the WHS console and the Update Password dialog should now work as expected.
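For what it’s worth, the same setting can be made directly in the registry. This is my understanding of the policy-to-registry mapping (the “Network security: LAN Manager authentication level” policy corresponds to the LmCompatibilityLevel value, and “Send NTLM response only” is level 2), so verify it on your own system before relying on it:

reg add "HKLM\SYSTEM\CurrentControlSet\Control\Lsa" /v LmCompatibilityLevel /t REG_DWORD /d 2 /f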

Now, I haven’t investigated why this wasn’t necessary on other versions of windows, and perhaps it is, but I never ran into the problem until now.

    
Posted in Computer Stuff, Windows Home Server

IE Zones, Group Policies, and Blocked Files…

So, I hear all the time from people close to me that they downloaded this and that file and it won’t open.  The standard solution is to Right Mouse->Properties then click “Unblock”.   But why does this happen and what can/should be done about it?

I’ll try to explain this for the lay person, but I won’t go too deep technically, mostly because I stopped digging when I found the bone I was looking for.  It is important to note that this isn’t just an Internet Explorer behavior: Chrome and Firefox users, and even files copied from a thumb drive or across a local network, can all potentially see the same result.

When you look at the Internet Options settings in the control panel, the Security tab shows you 4 “Zones” shown below.  Whenever you access a website in your browser or a network file location, it falls into one of the 4 Zones.

  1. Internet
  2. Local intranet
  3. Trusted sites
  4. Restricted sites

When you download or copy a file from anywhere that isn’t physically on your own machine, the file is tagged as originating from one of these 4 Zones.

So, what does it really mean when they are ‘Blocked’?  It depends on the kind of file.  Executables (.exe, .dll, etc) may work fine, but they will be prevented from accessing certain system services.  So that application might ‘appear to work fine’ but might also fail to perform certain tasks that require trusted status.  So, it’ll seem like it has a bug!

MS Office documents downloaded will come in as ReadOnly and require you to specifically approve them before you can edit, print or in some cases even save them.

A Zip file originating from an untrusted site will tag all files that get extracted from it as blocked as well unless unblocked first.

Simply adding the website name to the “Trusted Sites” in the “Internet Settings” dialog will cause all files originating from that site to be unblocked by default.

Always think before unblocking a file.

Geek stuff below…

Your PC’s “Group Policy” settings are the way that Windows defines which behaviors are allowed and which are not.  In this case, consider a Zip file downloaded from a site in the “Internet” zone.   The default GP (Group Policy) says to record the “Internet” Zone information with the file.  The default GP settings also say to show the “Unblock” button on the file’s “Properties” form since the “Internet” zone is an “untrusted” Zone.

If you Unzip that file, every file that originates from an untrusted (blocked) file is also tagged with that Zone and therefore ‘Blocked’.  If you ‘Unblock’ the zip file, then all the extracted files will also be unblocked.  It gets really tedious to select each file, Right-Mouse->Properties then click Unblock.  Imagine doing this a hundred times so that you can access the files.  A .NET XAML solution or project will often not run properly until all the XAML files are unblocked!

It’s possible to enable the “Do not preserve zone information in file attachments” policy, in which case you never need to Unblock a file downloaded from an untrusted Zone.  This is a terrible idea and I do not recommend it.  It’s better to add sites you trust to the “Trusted Sites” zone, for many reasons, this being one of them.  But if you must, in the Local Group Policy Editor these settings are under Local Computer Policy -> User Configuration -> Administrative Templates -> Windows Components -> Attachment Manager.

There’s also an easy way to remove the Zone information from a whole bunch of files in one shot, even a whole hierarchy of files.  The solution is provided by Microsoft in a simple command line tool called “Streams”, written by Mark Russinovich (of Sysinternals fame).  Now, there can be several kinds of streams attached to a file, but the relevant one here is the “Zone.Identifier” stream.

Download the “Streams” program from Microsoft ( http://technet.microsoft.com/sysinternals ).  It’s a tiny command line application that only has 2 optional parameters:  -d (delete)  and -s (process all files in all the subfolders).

Examples:

  • streams -s -d myFolderOfBlockedFiles     <– this deletes all the alternate streams from all files in the folder tree.
  • streams -d myBlockedFile                           <– this deletes all the alternate streams (including zone) from the file.
  • streams -s  myFolderOfBlockedFiles        <– this lists all the files with alternate ‘streams’ attached.

Create a text file named “Unblock.cmd” in the folder where you unzipped the “streams.exe” program, containing the following:

(path-to-streams.exe)\streams -s -d %1

Now all you need to do is type “Unblock <file or folder name>”  in a command shell if you need to unblock a file or bunch of files.
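For the curious, removing the zone stream yourself is just a delete against the alternate stream. Here’s a minimal C# sketch of the idea (purely illustrative; it calls the Win32 DeleteFile API directly because the managed File.Delete on the .NET Framework of that era rejects paths containing a stream name):

using System;
using System.Runtime.InteropServices;

class Unblocker
{
    // Win32 DeleteFile accepts "file.ext:StreamName" paths, so it can remove just the
    // Zone.Identifier alternate data stream and leave the file itself untouched.
    [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    static extern bool DeleteFile(string fileName);

    static void Main(string[] args)
    {
        string path = args[0];
        if (DeleteFile(path + ":Zone.Identifier"))
            Console.WriteLine("Unblocked " + path);
        else
            Console.WriteLine("No zone stream removed (error " + Marshal.GetLastWin32Error() + ")");
    }
}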

Posted in Computer Stuff

Is Windows Home Server Dead or just changing?

Sometimes a company does something that makes you really, really happy.  Microsoft did this when they delivered Windows Home Server.  HP did this when they released the HP MediaSmart Server.  I finally had a sweet solution for having ALL my home machines silently backed up every day, a highly available set of simple file shares for storing data, NO drive letters to remember, the ability to add storage to a pool by just plugging in a new hot-swappable drive, and selectable share duplication.  All this with:

  • a very tiny box
  • ultra low power
  • hot-swappable drive bays
  • manages my printers
  • has GB networking
  • lets me log in from the road and access my stuff
  • shuts itself off at night and turns back on in the morning
  • makes almost no noise sitting on my desk.
  • has lots of USB ports for charging stuff
  • runs my weather station
  • serves all my music and photos through iTunes, Windows Media Player, or to me over the web.

I should add that the user management in this system is so much simpler than managing a Server Domain.  Just add user accounts on the server that match the names of your LOCAL USER accounts on your home PCs and from that point on, the server takes care of syncing password changes.  There’s no need for having home users log into and manage domain accounts like you’d have to do with a real domain Server.

So, for the uninitiated, WHS used a pretty sweet technology referred to as Drive Extender, which essentially pooled the storage disks into what looked like one huge disk. It protected your files by ensuring that all files exist on at least 2 physical drives at all times.  So, if a drive dies, you remove it and plug in a new one whenever you get a chance, and DE takes care of all the rest.  I’d like a reliable DE replacement, not because I’m totally bent on being able to mix different size drives, but because it’s relatively painless.  RAID is not painless and needs experienced people to maintain it.

So what if the media streaming didn’t really work that well,  and the iTunes server used to work pretty well (not so much anymore), and the daily image backups were awesome in how they only required enough storage to account for the changed data.  Fortunately Windows Home Server 2011 doesn’t eliminate that…

Enter 2010: Microsoft was developing the next version, internally named Vail, which was to extend this awesomeness even further by basing the core on 64-bit Server 2008 R2 instead of 32-bit Server 2003.  I was so happy; thank you, Microsoft.

Then apparently they Forgot The Whole Point of WHS and decided to remove the Drive Extender.  Hence, no more simple share duplication, no more plugging in a new drive when you need more space in the pool, etc.  Now, they aren’t saying what’s going to replace it, but by default it’s probably going to be back to hardware RAID, or just take your own chances on drive failure.  I felt a bit hosed by this.  I really wanted to upgrade and keep the purely share-based disk pooling system. Don’t get me wrong, I love Microsoft technology; most of the time I just despise the decision-making process there.  What irks me is the dishonesty here.  They said that this decision is a result of Customer and Partner feedback.  This is so not true.  They may have talked to their customers, but they weren’t listening, as there is simply no way the WHS end users said anything even remotely like “it’s ok to go ahead and get rid of DE, we hate it”.

So, now I am unhappy.  I won’t be switching to the new WHS without some form of DE replacement.  Sure, it doesn’t HAVE to be DE, but it does HAVE to be reliable, and it needs to come from Microsoft.  That’s unfortunately not going to happen.  DriveBender does appear to solve this problem, hopefully.

I’m going to be much more cautious about WHS promotion moving forward, but I still think it was one of the most underrated products of the past few years.  Shame on HP for not pushing back on the WHS Team and for discontinuing your great little media server boxes.  Granted, they did continue to produce the HP ProLiant MicroServer line, but these are a little larger, louder, eat a bit more power and don’t have hot-swap bays.

Where to now?

Well, what options do I have?  I could just buy another EX495 as a backup and keep using WHS v1 for the next 5 or 6 years as is.

I could buy a Drobo box for a ton more coin, but this only solves part of the data protection problem and requires me to use up tons of disk space, since you can’t selectively enable duplication like you used to be able to do with DE. Drobo doesn’t even come close to providing anything like the super-easy automated network-wide system backups.  Remember, WHS uses a fraction of the disk space any other backup system does because of the sharing/reusing of backed-up file clusters, i.e. a ‘file’ backed up on one machine is shared across all backups of all machines, and across future backups, if it’s identical.  Some backup programs manage something like this using incremental backups of a single machine, but not shared across the whole home network of machines!  My server has over 100 complete backups instantly browsable for our 7 PCs over a period of 3 years, and they occupy less than 900 GB.  Drobo can’t help here, and it’s a Linux-based platform, so you pretty much have to be a geek to get apps running/configured there, although it’s not terrible.  My mom won’t be doing this anytime soon, however, so it’s not yet simple enough.

I could use Drobo as the file system and WHS on the EX495, but now I have two boxes, twice the power, more than twice the noise/heat, etc.  No thanks; not plug-and-play like WHS v1.

I suppose sometime before the Ex495 dies I’ll build a new quiet low power machine tuned for this task and upgrade to Windows Storage Server 2012.  We’ll see what the future holds…

Posted in Computer Stuff, Windows Home Server

My Next Guitar…?

I’ve been drooling over this custom BTO Taylor acoustic/electric guitar for about 4 years now.  Maybe I’ll save my pennies and have Taylor Guitars build me one.  Well, that’s after the house is painted, the roof replaced and the driveway sealed…  ok, well, nevermind…  🙂

Taylor BTO Grand Auditorium with Sinker Redwood Top

Posted in Music