Challenges Of ISO 17025 Accreditation – Survey Results

A group of forensic practitioners has recently conducted a survey into the ISO 17025 scheme – its effectiveness and its relevance to digital forensics.

The survey was conducted using Google Forms and was aimed at forensic practitioners on the front line of investigation, rather than at managers or corporations. What follows is a summary of the results.

The majority of the respondents were investigators in law enforcement, followed by those in the private sector. Some of the private sector respondents worked alongside law enforcement on some investigations, while others did not.

Most provided a range of services, the most popular being hard disk and storage media imaging; analysis of stored data; preservation and analysis of phone contents; server analysis; and network investigation and analysis.

When it came to organisation type, 65.3% of respondents worked in an organisation within law enforcement, and 18.8% worked in a specialist digital forensics firm with permanent staff. The remaining 15.9% were split between general forensic science laboratories, specialist digital forensics businesses run on cooperative lines, sole traders, and ‘other’.


Only 11.4% of individuals surveyed reported that their businesses were already accredited, with almost half of respondents saying they were in the process of preparing for accreditation.

Of those whose organisations were currently using an industry standard, most were using ISO 17025, with ISO 9000 coming in second. One third of respondents said their organisations were not using any standards at present.

Of those who were using standards, most focused on a few core competencies:

  • Adequacy of training
  • Qualifications
  • Verification
  • Validation
  • Compliance with an established Good Practice Guide
  • Written standard procedures

Most people reported having a ‘reasonably good’ understanding of ISO 17025, although a quarter of respondents said their understanding was poor, and a small proportion claimed not to understand it at all.

There was then space in the survey for respondents to add further comments about ISO 17025 requirements. Responses included:

“The mass duplication of effort involved in every digital forensic lab in the country simultaneously and independently preparing for accreditation seems an unconscionable waste of public man-hours at a time of austerity. Why one single UK-wide body couldn’t have been given the task of validating forensic tools, which could then be rolled out to all labs, is beyond me.”

“I’ve not ticked any of the above boxes because I believe they shouldn’t apply. I’ve worked in a previous high tech unit that was pursuing ISO17025 as a whole, (rather than just imaging) and have considerable experience of it. I strongly believe ISO17025 will offer no guarantees of quality in this field. Whenever evidence is challenged it is never to do with written procedures, methods of acquisition, tools used to examine data and validation of processes, equipment and software. It is always to do with interpretation of that evidence. Just because there is a challenge doesn’t mean the interpretation by the prosecution was wrong.”

“I do not feel that this adds any value to the results produced by my department. It is just unnecessary red tape with no benefit. Attempts to meet validation criteria have resulted in 2 staff working full time to meet accreditation, and only 2 left to do the examinations. Meeting validation requirements appears to involve jumping through hoops with no added value in terms of quality of process. We are currently only looking at imaging of hard drives. I do not feel that validation of phone examination processes is possible in any meaningful way. There are too many different phones, too many variations in software, and far too many apps – all of which again have different versions. It is only possible to validate this by simplifying the scenario to the point that it becomes completely meaningless – e.g. validate a phone model that only does calls and SMS, when in reality most people use smart phones with multiple apps. As with all application of real-world things to a computer environment, there are significant difficulties in translation. Most of the principles of 17025 from wet laboratories to digital are lost in translation. There is a lack of understanding of the differences between computer science and the real-world science on the part of the FSR. This lack of understanding begins at the use of 17025 as a standard for a digital lab in the first place, and continues through the application of its standards.”

Other commonly cited concerns included the costs involved, particularly for small businesses and individual practitioners, and the view that ISO 17025 is not applicable to digital forensics because standards designed for wet forensic science are not relevant to digital investigations.

One major concern for a number of participants was the time involved in accreditation and implementation. Backlogs are already one of the main challenges in digital forensic investigations, and there was a high level of concern that accreditation would only make them worse.

However, not everyone felt negatively about the scheme:

“I was originally very sceptical about the value of ISO 17025 but as I have found out more about it and moved towards implementation I have become convinced that it is the only way forwards.”

Some were sceptical about the usefulness of ISO 17025 but proposed alternatives:

“Whilst I agree with standardisation and use of scientific methodology, there appears to be a lack of understanding from the FSR’s stance that Digital Forensics is purely Lab based, where a single test can validate a method/process/tool. The attitude that a Digital Lab needs to be more akin to a standard Forensics Lab is flawed. Computers, Networking and IT are a product of an engineering environment, where multiple variables can influence outcomes, in subtle software differences different tools are required for different situations. Relying on a small number of validated tools can be too restrictive, and the regime for constant changing time consuming. With Digital Forensics you are not bring[ing] a discrete exhibit into a controlled environment. You are effectively dealing with a dynamic crime scene, and whilst you can control it to some extent, you need to be flexible in your approach and investigation, in this respect ISO 17025 is the wrong standard, whilst still not ideal ISO17020 would be more appropriate. There also needs to be more integration with existing ISO Cyber Security standards concerning Digital Forensics being developed ISO/IEC 27037:2012, which follow industry standards from NIST (who in respect of Digital forensics heavily reference ACPO principles). I would suggest most commercial providers will follow this standard as Cyber Security is commercially more lucrative than Criminal Justice work, and ISO27K will be the standard that potential customers will specify.”

“Strikes me that the software vendors should be tasked with ensuring their products conform, rather than umpteen labs testing the same piece of software which is prohibitively time-consuming for small companies. Validation through the use of multiple tools and reviewing the data at a lower level should be the goal.”

As for costs, most practitioners were not sure how much achieving ISO 17025 accreditation had cost their organisation, but 15.1% reported a figure of £50,000 or more.

There was anecdotal evidence to support many respondents’ claims that ISO 17025 accreditation would be punitively expensive:

“My previous department unsuccessfully applied for accreditation. When considering the costs of salaries of dedicated quality managers, (which we didn’t used to have), the loss of time for each analyst in conducting validation and completing extra forms, I would say the process cost well in excess of £200,000. We did in fact cut two analyst positions to fund two quality managers. I am particularly concerned about cost. In that department almost the entire training budget was swallowed up by costs of accreditation and we also had to cut back on the number of licences we had for some tools where we really need to have one licence per examiner. As analysts we found our productivity dropped with most analysts doing about 70% of the number of cases per year that we used to complete. The only way to get that number of cases back up was to do a less thorough investigation than we used to do. I don’t believe small companies and ‘one man bands’ will be able to afford ISO17025 and this describes the majority of defence experts. What happens when a defendant can’t find an expert? Considering staged reporting means evidence may not have been validated by an analyst, I think there is a greater need than ever for good defence experts. I believe we will start to see miscarriages of justice.”

“ISO 17025 would cost more than my turnover and would be entirely irrelevant to my work. I would expect that it would give similar difficulties to other sole practitioners and those who specialise in working for the Defence. Not all work is laboratory work.”

“Extra staffing costs over 60k per year, training costs £360k, new equipment 100k.”

“With recent legal aid cuts and hours granted by LAA being strained, it is not in my view achievable for single practitioners who still have a passion to work within the criminal sector and work in multiple disciplines. We are already struggling to make a living out of it and if we don’t work on cases we don’t earn anything. Hence, it isn’t just the actual costs but also the large loss of income and research time. I work in the evaluation of phone content evidence, connection record analysis, cell site analysis, attribution evidence with respect to phones and then full computer forensics from a USB up to servers. I’d be constantly working on getting accredited in each of the areas and not actually working on cases. I also work in specialist cases at times testing software/systems/procedures to determine their functionality and/or interpret data. I doubt the LAA would grant me funds to write bespoke procedures for these cases – it would likely cost more than my actual work. I also sometimes turn up on a site without any prior knowledge of the system I will be testing.”

Respondents were then asked some questions about their own experience and the work they do.

Most had completed commercial courses or in-house training, with a significant proportion also having gone through a related university course.

The most popular tools were EnCase, FTK and IEF; 56% of respondents used individual utilities written by third parties, and 29.7% used self-written utilities. When asked whether they had carried out validation or verification of their tools, some said they performed no validation at all. Dual-tool verification was a popular option, as was hash checking, and some practitioners used a drive containing known files to validate new tools before using them on a case.
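As a rough illustration of the hash-checking approach several respondents described, the sketch below (Python, with placeholder file names and hash values rather than anything taken from the survey) re-hashes an acquired image and compares the result with the hash recorded at acquisition time.

```python
# Minimal sketch of the hash check described above: re-hash an acquired
# image and compare it with the hash recorded by the acquisition tool.
# The image path and recorded hash below are placeholders.
import hashlib
from pathlib import Path


def sha256_of(path: Path, chunk_size: int = 1024 * 1024) -> str:
    """Stream the file in chunks so large images don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()


if __name__ == "__main__":
    image = Path("evidence/drive01.dd")             # hypothetical image file
    recorded = "sha256-value-from-acquisition-log"  # placeholder value
    computed = sha256_of(image)
    print("MATCH" if computed == recorded else "MISMATCH")
    print(computed)
```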

Several people felt that it should not be down to practitioners to validate the tools, but to the companies producing them. Some made the distinction that they validate their results, not their tools.

A couple of organisations had a dedicated member of staff whose role was to test and validate the tools being used in the lab.

One respondent pointed out a challenge of validating tools:

“Due to the proprietary nature of commercial tools, we have no access to source code and any testing is therefore black box. Due to the complexity of these tools and the unpredictable nature/infinite variations of digital source data, it is impossible to test every possible program path by black box testing. Whilst such lab tests may validate one aspect of the tool when using a specific test disk, a real world exhibit may contain a completely different configuration (e.g. operating system, hardware, installed applications, hardware/software corruption) which could result in incorrect output. In my opinion, caution should be applied to such tests as validation in the lab does not necessarily prove the tool will work correctly in all future use cases.”
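To make the respondent's point concrete, a black-box check of this kind typically amounts to comparing what a tool reports recovering from a known-content test drive with a manifest of what that drive actually holds. The sketch below assumes simple CSV files of filename/SHA-256 pairs; the file names and layout are illustrative rather than the export format of any particular product, and passing such a test on one drive says nothing about other configurations.

```python
# Illustrative black-box validation against a "known files" test drive:
# compare a tool's export with a manifest of the drive's known contents.
# Both files are assumed to be CSVs with 'filename' and 'sha256' columns.
import csv


def load_hashes(path: str) -> dict[str, str]:
    """Read filename -> sha256 pairs from a CSV file."""
    with open(path, newline="") as f:
        return {row["filename"]: row["sha256"] for row in csv.DictReader(f)}


def compare(expected: dict[str, str], recovered: dict[str, str]) -> None:
    missing = sorted(set(expected) - set(recovered))
    mismatched = sorted(name for name, digest in expected.items()
                        if name in recovered and recovered[name] != digest)
    print(f"expected {len(expected)} files, tool recovered {len(recovered)}")
    print(f"missing: {missing or 'none'}")
    print(f"hash mismatches: {mismatched or 'none'}")


if __name__ == "__main__":
    compare(load_hashes("test_drive_manifest.csv"),  # known drive contents
            load_hashes("tool_export.csv"))          # hypothetical tool output
```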

When asked whether ISO 17025 could be applied to all forms of digital forensics within their practices, the majority replied that it could not. Some elaborated on this:

“Absolutely not. We deal regularly with volatile data on unpredictable systems, where our actions have to be chosen by risk assessment rather than by rote procedure.”

“I do not think the standard applies to all parts of digital forensics. Take the examination of mobile devices for example, they move at such a pace that trying to verify a complete tool to examine all types of phones is simply impossible. However, ISO is causing us to use a standard for the sake of it even though it has no hope of achieving what it is designed to.”

Most believed that ISO 17025 could not be used for event reconstruction, and the majority were also of the opinion that it neither could nor should be used in the context of expert opinion.

The survey then moved on to ask about CPR 19 (Part 19 of the Criminal Procedure Rules, which governs expert evidence) and whether organisations were compliant with it.

Space was then provided for respondents to leave any further comments on the accreditation process. Some reported that their companies were closing down as a result of ISO 17025 being applied to digital forensics. Others reported changes in the way they conduct investigations:

“I came into forensics a year before ISO 17025 came into place and what I’ve seen is a real change of culture in our workplace, from what is supposed to be investigative forensics (finding one piece of evidence leading to another, like a chain reaction) into following ISO standards where we can no longer chase down the chain and instead stop where our standard operating procedures tell us to stop, it basically shackles us. There is a procedure in our ISO manual to say that we can do “infrequently used techniques” which would cater for this type of chasing, but you need to know what exactly it is you are aiming to obtain, plus it has to be approved by a bunch of working groups and management meaning by the time you actually want to do something like this, it becomes such a hassle that it slows the whole forensic process down.”

Overall, sentiment about ISO 17025 was overwhelmingly negative, with very few respondents saying they thought it would be useful, and several declaring that it was making them think twice about continuing their careers in digital forensics.

“Working toward 17025 has made me want to leave computer forensic examinations. I believe it stifles innovation and makes examiners think twice about using methods that they know will take an age to validate and write up. I have been involved in forensics for nearly 10 years and I now want to leave.”

Some said they thought ISO 17025 was “a good idea, poorly executed”, and that something similar but more relevant to digital forensics would be a vast improvement.

You can find the full results of the survey here (PDF).
