Picking Office 365 is the easy part — now comes migration

Microsoft’s J. Peter Bruzzese writes about the challenges you’re likely to face once you’ve decided to migrate email to Office 365.

So you’ve decided to make the move to Office 365. Now you face big questions. How do you get your email data into Office 365? Do you even want to move all your email data into it? Just how do you ingest the data while ensuring the data remains unchanged from a compliance perspective? Have you analyzed where your data currently resides? Is it in Exchange or another email system? In a legacy archive? In PSTs?

Read on: Picking Office 365 is the easy part — now comes migration.

Posted in Email and Archive Migration

To find or be fined? That is the question.

What do human resources teams, internal audit teams, investigators and regulators have in common?

When an employee makes an allegation against a company, the HR team often needs to trawl through troves of cluttered archives to find relevant emails.

When an organization receives regulatory requests for information, the internal audit team has to comb through sizable data stores to identify and review the pertinent details.

In the event of litigation, an organization’s legal department will also find itself with the pressing need to secure particular pieces of evidence kept in large, unstructured databases. In each instance, there is a strategic advantage to finding the data as soon as possible.

What’s striking is that, although each department uses different terminology, their requirements have a lot of similarities. HR, legal and auditors need a quick, accurate way to find specific information amid vast unstructured data sets, to avoid being fined or facing legal damages.

Nuix can help. Our software lets you process unstructured data kept in complex storage containers to create an index that’s fully functional, easy to use, and lets you account for every piece of your data. At Nuix, we call this a “living index.”

With all of your data pre-processed, pre-culled and ready for search, eDiscovery and regulatory requests are much easier to deal with. Moreover, the powerful, patented Nuix Engine can process and structure data faster than any other technology on the planet, which is helpful when you need to locate certain pieces of information very quickly.

Right now you may be wondering what on earth your internal teams have in common with external forces of nature such as regulators and investigators. Ah yes, well spotted.

You may not know it, but most regulatory bodies and many litigation service providers worldwide also use Nuix to find information hidden in large, complex collections of unstructured data. This lends an interesting additional benefit to using Nuix.

US Securities and Exchange Commission building

When the regulator calls, make sure you can respond quickly and accurately. Photo: John M

Recently, Nuix gained a major UK bank as a client. The bank decided to invest in Nuix eDiscovery to cope with large, frequent and detailed regulatory requests. Our software reduced the bank’s average response time from weeks to hours.

Once the regulator—also using Nuix—noticed a pattern of diminished response times, it reduced the scale of its requests for information. It now asks the bank only intermittently for small, pointed pieces of data, knowing how quickly and accurately the bank can respond.

With the help of our software, the bank not only improved its response times, but also lessened the regulatory scrutiny it was subject to.

Nuix serves a simple purpose for enterprises: We help you find and remediate sensitive information before you have to deal with legal damages or fines from regulators.

To find or be fined… which would you prefer?

Posted in Digital Investigation, eDiscovery, Information Governance

Don’t skip the jump lists!

From discussions I have had recently, it seems a lot of people—even seasoned investigators—don’t understand Microsoft Windows jump lists. Now that Nuix 5.2 has been released, I expect this will change very quickly.

Jump lists have been with us since the launch of Windows 7 in July 2009 and are becoming more prevalent in Windows 8. Over time, software vendors have grown to rely on jump lists, more than the Windows Registry, for their most recently used (MRU) and most frequently used (MFU) lists. As a result, I often gain more information about user behavior from jump lists than from MRU and MFU entries in the Registry.

Anyone who has undertaken link file analysis will know that the evidence drawn from this work is invaluable. These artifacts record a user’s behavior and access history for files and resources within the computer and, sometimes more importantly, outside it. I would argue that this is the single most reliable method for evidencing user activity and behavior, and any forensic examination should include this analysis.

So why doesn’t everybody do it? Is it just that investigators aren’t fully aware of jump lists or that our core forensic applications don’t make analysis easy?

Jump lists in detail

So let’s take a look at these artifacts. They are located within a user’s profile, so the information is specific to individual accounts. They can be found in two forms:

  • Windows creates and populates automatic lists as the user engages with the system. Each time the user launches an application, opens a file or accesses a remote resource, this activity is recorded in the automatic jump lists. (*.automaticDestinations-ms files located at %UserProfile%\AppData\Roaming\Microsoft\Windows\Recent\AutomaticDestinations)
  • Custom jump lists are created when a user “pins” an application to the Windows taskbar. From there, a custom list works the same way as an automatic list as the user interacts with the pinned application. (*.customDestinations-ms files located in %UserProfile%\AppData\Roaming\Microsoft\Windows\Recent\CustomDestinations)

Take a look at these destinations and you’ll see many jump lists that look something like 1bc392b8e104a00e.automaticDestinations-ms. The string of characters at the start is an application ID, or APPID, that identifies a specific application and, in most cases, the version of that application. In this case, 1bc392b8e104a00e refers to Windows Remote Desktop. (You can find a comprehensive list of these at ForensicsWiki.) Knowing this, we can target our investigation to the use of specific applications.
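Because the APPID is simply the file name’s leading hex string, triaging a folder of jump lists by application is straightforward. Here’s a minimal sketch in Python; the function names are my own, and the lookup table is a tiny illustrative excerpt (only the Remote Desktop entry comes from this post—use the full ForensicsWiki list in practice):

```python
from pathlib import Path

# Illustrative excerpt only; see the ForensicsWiki APPID list for a full table.
KNOWN_APPIDS = {
    "1bc392b8e104a00e": "Windows Remote Desktop",
}

def appid_from_jumplist(path):
    """Extract the hex APPID from a jump list file name,
    e.g. '1bc392b8e104a00e.automaticDestinations-ms' -> '1bc392b8e104a00e'."""
    return Path(path).name.split(".")[0].lower()

def identify_application(path):
    """Return the application name for a jump list file, if the APPID is known."""
    return KNOWN_APPIDS.get(appid_from_jumplist(path), "unknown APPID")
```

On a live system you would point this at the AutomaticDestinations and CustomDestinations folders under %UserProfile%\AppData\Roaming\Microsoft\Windows\Recent and glob for *.automaticDestinations-ms and *.customDestinations-ms files.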

These files are in a compound binary format, containing numbered streams that follow the shell link (MS-SHLLINK) binary format. In other words, they hold substantial link file information. The trouble is, few tools can parse these files or target specific information within them.
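That compound binary wrapper is the standard OLE/Compound File Binary (CFB) container, which always begins with a fixed 8-byte signature. A quick stdlib-only sanity check looks like the sketch below (assumptions: the function name is mine, and fully parsing the numbered streams would need a real CFB parser such as the third-party olefile package):

```python
# OLE/Compound File Binary signature: the first 8 bytes of every CFB container.
CFB_SIGNATURE = b"\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1"

def is_compound_file(path_or_bytes):
    """Return True if the data begins with the CFB signature.
    Accepts either a file path or raw bytes."""
    if isinstance(path_or_bytes, (bytes, bytearray)):
        header = bytes(path_or_bytes[:8])
    else:
        with open(path_or_bytes, "rb") as f:
            header = f.read(8)
    return header == CFB_SIGNATURE
```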

Jump lists in Nuix 5.2

In our latest release, Nuix 5.2, investigators can quickly access and review the data held in jump lists. Using Nuix’s dynamic metadata view, you can explode information from multiple jump lists into an interactive and useable table format.

In the example here, we have specified a jump list associated with Remote Desktop using the unique APPID, and then used the dynamic metadata function to display the information within the jump list.

Analyzing jump lists in Nuix 5.2

Analyzing jump lists in Nuix 5.2.

From these simple steps, we can extract the date and time of the connection, the command line arguments and the destination IP address. (In this example we’ve only shown two fields; however, Nuix can present all available fields and information if you choose.)

As you can see, the content of a jump list depends on which application it refers to. A jump list for a picture viewer or word processing application might contain a list of file locations and access times.

Jump in!

As organizations and individuals adopt Windows 7 and 8, especially now that Microsoft has ended support for Windows XP, we can expect to find jump lists in nearly all of our cases. Investigators simply must use jump lists to understand who, when, where and what happened—and this can quickly lead us to “how.”

Nuix 5.2 has brought jump list analysis into its core Investigator applications and made these queries quick and easy to access. We no longer need to hop over or skip the valuable evidence that can be found in jump lists.

Posted in Digital Investigation

The 5 biggest legacy data problems you’ll encounter migrating to Office 365

I’ve died and gone to heaven. Well, thankfully I’m not actually dead but I am definitely in heaven. The Microsoft Exchange Conference (MEC) is happening in Austin, Texas this week. MEC is a gathering of nearly 2,000 professionals who spend their lives working with email. My love for email is well documented so it should come as no surprise to you that I’m here in Austin soaking up everything Microsoft has to offer and commiserating with my fellow email-people.

Nuix booth at MEC 2014

MEC attendees at the Nuix booth … but are they as excited as Rocco? Photo: Alexis Robbins

Of all of the hot topics I’ve heard discussed so far this week, none is hotter than Office 365. People are asking, what are the benefits of migrating our email to the cloud? Is it right for our organization? What happens to all the third-party applications we plug into Exchange now that it’s hosted in the cloud?

Most of the companies I’ve spoken to here in the last 24 hours have already decided to move email to the cloud either now or in the immediate future, so they’re focused on planning their migration. Almost always, the hardest part of migrating to the cloud is dealing with legacy enterprise email archives and personal email archives.

I’ve been helping companies make this transition since the days of Microsoft’s BPOS and this is what Nuix’s Intelligent Migration business is focused on. There are five big problems I see organizations encounter again and again.

  1. Legacy archive APIs. Email archiving guru Michael Lappin explains all of the reasons you shouldn’t use your legacy archive’s built-in application programming interface (API) to extract data. In short, APIs slow you down and cause your healthy data to become corrupted while you’re extracting it. About a third of the migration projects we have taken on in the past two years have been to replace an API-based archive extraction tool for this very reason. Our patented engine skips the API and uses binary extraction to rapidly migrate your data with full chain of custody.
  2. Exchange Web Services (EWS) throttling. Microsoft generously offers unlimited storage with your Office 365 archives. However, you need to be aware that EWS only allows you to bring in your legacy data at a rate of around 400 gigabytes per day. What do you do if you have 20, 50 or 100 terabytes of legacy data in an enterprise archive like Enterprise Vault or even in user PSTs? The trick is to maximize your 400 GB daily allotment with as few resources as possible. Other migration technologies make only one connection per migration server, which means they may need 10, 20 or more servers to ingest 400 GB in 24 hours. Nuix makes up to 32 connections with a single migration server, allowing you to max out your daily allotment with just one machine.
  3. Internet throttling. You’re thinking to yourself right now that you would LOVE to get 400 gigabytes per day of legacy data into your new Office 365 account. You only wish your internet link would support that much data transfer. Lack of bandwidth is definitely a big issue for a lot of organizations. To help you bypass your slow internet connection, Nuix leverages Microsoft’s other cloud platform, Azure, to get your data uploaded to the cloud at the speed of copy and paste. We rapidly stage your data in Azure, bypassing your slow internet connection, then migrate it into Office 365 via Exchange Web Services at the maximum rate of 400 GB per day.
  4. Non-Exchange data formats. What do you do if your Enterprise Vault, SourceOne or Autonomy archive contains data from a legacy Lotus Notes or GroupWise email system? Traditional archive migration tools either don’t support you or require you to go through a three-step process of extracting, converting and then ingesting your data. Nuix Intelligent Migration automatically converts your non-Exchange data into an Exchange format compatible with Office 365 in one step. What if you have EMLs, EDBs, MBOXes from Google, NSFs or some other email data file type that can’t be ingested into Office 365? Nuix converts and migrates those as well.
  5. Journal archives. Have you been journaling data from your legacy email system into your enterprise archive? Your single-instanced journal archive isn’t compatible with your new Office 365 mailbox archives. Traditional archive migration tools don’t convert journal archives to mailbox archives. How do you maintain this data for compliance and eDiscovery purposes? Nuix Intelligent Migration automatically converts journal archives into Office 365 mailbox archives by “rehydrating” a copy of every message for every message owner into the right mailbox archive.
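The 400 GB/day EWS ceiling turns migration planning into simple arithmetic. A back-of-the-envelope calculator might look like this (the daily cap and the 32-connections-per-server figure come from the text; the per-connection throughput parameter and function names are illustrative assumptions, not measured values):

```python
import math

EWS_DAILY_CAP_GB = 400       # approximate EWS ingest ceiling (from the text)

def days_to_migrate(total_tb):
    """Days needed to push total_tb terabytes through the 400 GB/day cap."""
    return math.ceil(total_tb * 1024 / EWS_DAILY_CAP_GB)

def servers_needed(per_connection_gb_per_day, connections_per_server=1):
    """Migration servers required to saturate the daily cap, given an
    assumed per-connection throughput (illustrative figure)."""
    per_server = per_connection_gb_per_day * connections_per_server
    return math.ceil(EWS_DAILY_CAP_GB / per_server)

# 50 TB of legacy archive: 51,200 GB / 400 GB per day = 128 days minimum,
# regardless of how many servers you throw at it.
```

Assuming, say, 20 GB/day per connection, a single-connection tool needs `servers_needed(20, 1)` = 20 servers to reach the cap, while 32 connections on one server (`servers_needed(20, 32)`) already saturate it.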

If you’re planning to move your email to Office 365 or even on-premises Exchange 2013, these problems may sound familiar.

Nuix Intelligent Migration soft toys

Archive troubles? Nuix Intelligent Migration makes pigs fly. Photo: Alexis Robbins

Nuix loves email people and we would love to help you get to your new Exchange environment faster and with fewer headaches. If you’re at MEC, stop by booth #207. If not, get in touch!

Posted in Email and Archive Migration

Your eDiscovery skills have many applications

I decided to start my first blog off for Nuix with an idea that I think many Nuix users and other eDiscovery professionals take for granted.

Last week, I attended an information industry event that discussed the future of information management. After the opening speeches, I was naturally drawn to some of the smaller discussions, basically anything labelled “discovery!”

The themes and ideas that kept coming up were stories I’ve heard from our side of the discovery fence for many years: involve legal early and often in data decisions; develop a tiered approach to culling your data; identify and remove duplicates; and maybe even try some newfangled thing called “near duplicates” (complete with the air-quotes gesture). Oh, and don’t forget to use a bit of analytics on the remaining data to help you act.

As I looked around the room, I saw faces full of wonderment about these great new ideas and how they could be applied in their unstructured data stores. The room wasn’t full of lawyers; it was full of CIOs and information managers.

Data nerd

If you’re an experienced ESI specialist, chances are you can do a lot more with your skills.
Photo: Yang Jiang, NYC Media Lab

So here’s the news: the skills we have employed for years in eDiscovery are, with a few twists, now the latest trend in proactive information management.

If you have honed your skills in proactive eDiscovery, utilising a deep analysis tool, you will innately understand the issues that come with unstructured data. Those skills have value around the organization.

You can provide reports and strategic advice to other areas of the business based on your expertise with unstructured data. You can assist with methodologies that will help ensure the business can meet regulatory demands and conduct internal investigations by applying your knowledge and skills in fast searching, review and production to disciplines such as records management.

Guess what? You’re an experienced ESI specialist who every day uses a tool that has its roots in digital forensics. Revel in applying your eDiscovery knowledge to the additional data it reveals and the issues you can identify early on that might be bad news for your clients.

data.path installation by Ryoji Ikeda

Opportunities abound if you know your way around unstructured data. Photo: r2hox

My advice is to take every opportunity you can get to expand your knowledge about how different environments can impact the way data is stored and therefore what information you see once it is indexed. You never know what opportunities it could lead to—for yourself and your business.

Posted in eDiscovery

I give up: TIFFs, you win

For years I have sworn hand on heart that Nuix would never create a quality control module because TIFFing was an unnecessarily expensive, cumbersome and soon-to-be-replaced form of document presentation.

Stacks of paper files in an abandoned hospital

TIFFs were an improvement over massive stacks of paper, but are they also obsolete? Photo: Timm Suess

For those readers who don’t work in the legal industry, TIFFing is the practice of taking all the documents you must disclose to the opposing party during litigation and converting them into TIFF files—in other words, bitmapped images of those documents. To turn these files into computer readable and searchable text, the other side must run each one through an optical character recognition engine.

It’s a practice that might have made sense more than a decade ago, when discovery mostly involved scans of paper documents. It’s a lot easier to hand over a million TIFFs than a million pieces of paper. But today it sounds crazy. Why not just hand over email messages, Word documents and the rest in their original format?

In fact, most jurisdictions around the world allow native production, which is much more efficient. But TIFFing, like urban myth chain emails and passionfruit vines, just won’t die.

Listening to customers is a fundamental part of what we do at Nuix. I have come to realize that no matter what I think or fervently wish, TIFFing is still a requirement for many organizations.

So I will eat my words, along with some humble pie.

When we release Nuix 5.2 next week, it will have a fully functional QC module. This will enable you to review all imaging and stamping and insert custom slip sheets before export.

Nuix QC module

Previewing TIFF production in Nuix’s QC module.

Of course, we bring the parallel processing power of the Nuix Engine to bear, so our production rates and dithering quality are second to none. And if there are any document types our settings don’t do justice to, you can open them natively within Nuix, print them as desired and have Nuix import them right back in place.

I hope this will finally put to rest any idea that Nuix is bad at TIFFs.

I apologize for breaking my promise, but TIFFs have won this battle. Nuix is getting on the bandwagon.


Posted in eDiscovery

Building international cybersecurity capacity with ITU-IMPACT

Cybercrime is an increasing problem faced on a global scale. That’s why it’s more important than ever for agencies to network across international boundaries so they can pool resources and share expertise to tackle this challenge. I recently had the pleasure of conducting training sessions at a global workshop in Cyberjaya, Malaysia, which aimed to do just that.

Stuart Clarke delivers training at INTERPOL-ITU-IMPACT seminar

Stuart Clarke delivers cybersecurity training at an INTERPOL-ITU-IMPACT seminar in February. Photo: ITU-IMPACT

The event was organised by INTERPOL and our industry partner ITU-IMPACT, the cybersecurity executing arm of the United Nations. Its goal was to improve the cybercrime investigation capabilities of law enforcement agencies, national computer emergency response teams, regulators and internet service providers in the ASEAN region.

More than 40 senior officials from 15 countries including Malaysia, Indonesia, Philippines, Singapore and Thailand attended the three-day event. My colleague Rob Attoe and I conducted a number of sessions alongside INTERPOL, ITU-IMPACT and Stroz Friedberg, an investigations, intelligence and risk management firm.

It was a huge pleasure to be part of such an important event and share our digital forensics knowledge with developing countries in the ASEAN region.

Here’s an interview I did at the event, on just why this work is so important and what we can do to tackle the problem of cybersecurity across international boundaries.

Posted in Cybersecurity
