Risk Management

Risk Assessment

Many people use hazard and risk interchangeably on a daily basis. Unfortunately, they are actually two different concepts. The difference may not be much of an issue in everyday conversation, but when it comes to risk assessment and control, it is extremely important. Below you will gain a better understanding of the difference between the two and why it matters so much.

The basic difference is that a hazard is something that can cause harm, while a risk is the possibility that a hazard actually will cause harm. Although the words are used synonymously, knowing the difference could save your life or allow you to enjoy it more thoroughly.

In essence, a hazard poses no risk unless you are exposed to enough of it to actually cause harm; the risk may be zero, or it may be greatly reduced, when precautions are taken around that hazard.

The simple relationship between the two is that you must be exposed to a hazard to experience a risk. Thus, it is vital to know your level of exposure to the hazard in order to understand how much risk is actually involved.

Risk Assessment Methods

There are a variety of risk assessment methods, and different fields may use different measurements and techniques. For example, the way risk is assessed in human health may differ from the way it is assessed in project management.

Why Use a Risk Assessment Method?

A risk assessment is a tool used to determine the potential results of any given hazard. The assessment combines situational information, previous knowledge about the process, and judgments made from that knowledge and information. Since risk is the potential damage done by a hazard, there are certain outcomes that any good risk assessment needs to produce.

There are five main outcomes needed for an effective risk assessment. By the end of the assessment you should know:

  • Any situations that may be hazardous
  • Which method is appropriate to use when determining the likelihood the hazard will occur
  • Alternative solutions for reducing or eliminating the risk, or any negative consequences that may occur
  • More information for making a decision about risk management
  • An estimate of the uncertainty of the analysis

Steps of a Risk Assessment

Step 1: Discover the hazards. You can do this by using several different strategies such as walking around the area, navigating through portfolios and databases, or asking people who are around.

Step 2: Determine who may be harmed and how they may be harmed. After discovering the hazards you will need to determine who may be harmed by them, as well as how they may be harmed.

Step 3: Analyze the amount of risk and decide how you can control it. You may find that you can simply remove the hazard. If not, decide which control method will best reduce the amount of risk.

Step 4: Document your assessment and results. It is important that you document what you find. This is done for legal reasons to protect you, the location, and any possible persons that may be involved. You also want to be sure that you write down your next plan of action – what control measures you are going to take.

Step 5: Regularly review and update your assessment. It is tempting to think that once the hazard is gone, all risk of harm is gone. This is not true. In some cases the hazard may return, and in others new hazards may develop. Regular checks will keep you and everyone around you safe.
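
To make step 3 concrete, the sketch below scores hazards with a simple likelihood-times-severity matrix. The 5-point scales, the band thresholds and the example hazards are illustrative assumptions, not a regulatory standard.

    # Minimal risk-matrix sketch. The 1-5 scales, thresholds and
    # hazards are illustrative assumptions, not a standard.

    def risk_score(likelihood: int, severity: int) -> int:
        """Likelihood x severity, each rated 1 (low) to 5 (high)."""
        assert 1 <= likelihood <= 5 and 1 <= severity <= 5
        return likelihood * severity

    def risk_band(score: int) -> str:
        """Map a score onto an action band (thresholds assumed)."""
        if score >= 15:
            return "high: act immediately"
        if score >= 8:
            return "medium: plan controls"
        return "low: monitor"

    hazards = [("wet floor near entrance", 4, 2),
               ("unguarded press", 2, 5)]
    for name, likelihood, severity in hazards:
        s = risk_score(likelihood, severity)
        print(f"{name}: score {s} -> {risk_band(s)}")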

Risk Control Methods

Knowing the difference between hazard and risk leads to risk control. Risk is controlled when your business takes actions that eliminate safety risks as far as possible. If it is not possible to completely eliminate a risk, controlling it may mean taking actions to minimize the risks and hazards within the work environment.

There are four main methods that can be used to eliminate or minimize these risks – avoidance, loss prevention & reduction, transfer, and acceptance.

1. Avoidance

This is by far the simplest way to control any risk. When you use this method, you identify all potentially hazardous activities and stop them. Remember that when choosing this option you may also miss out on other opportunities and gains.

2. Loss Prevention & Reduction

Using this method you will reduce the frequency and severity of a specific loss. You may decide to increase security measures or improve maintenance, or you may create rules that require your employees to wear certain safety gear.

3. Transfer

When you choose this method, you contract with a third party to deal with the risk. A couple of good examples would be hiring a security company to improve security or hiring a cleaning crew to ensure health hazards are cleaned up.

4. Acceptance

This last method is not to be taken lightly. When you feel that the transfer or loss prevention & reduction methods are unnecessary or excessive, this may be the option for you. However, it is important to understand that it could be dangerous for your company: absorbing too many losses or enduring too many negative consequences can quickly sink your business.

Automated Manufacturing Practice

Good Automated Manufacturing Practice for Pharmaceutical Industries

The Good Automated Manufacturing Practice (GAMP) Forum was founded in 1991 by pharmaceutical industry professionals in the United Kingdom to address the industry’s need to understand the evolving expectations of regulatory agencies in Europe. The organization also sought to promote understanding of how computer systems validation should be conducted in the pharmaceutical industry.

GAMP rapidly became influential internationally as the quality of its work was recognized. Over time, GAMP has become the acknowledged expert body on computer system validation.

GAMP’s guidance approach defines a set of industry best practices to enable compliance with all current regulatory expectations. More than simply a strict compliance standard, GAMP is a guideline for life sciences companies to use for their own quality procedures. As a result, it can be tailored to a number of computer system types.

Computer system validation following GAMP guidelines requires users and suppliers to work together so that responsibilities regarding the validation process are understood. For users, GAMP provides a documented assurance that a system is appropriate for the intended use before it goes live. Suppliers can use GAMP to test for avoidable defects in the supplied system to ensure quality product leaves the facility.

The GAMP framework addresses how systems are validated and documented. Companies do not need to follow the same set of procedures and processes of a GAMP framework to achieve validation and qualification levels that satisfy inspectors. Instead, GAMP examines the systems development lifecycle of an automated system to identify issues of validation, compliance and documentation.

As a voluntary program, GAMP offers both challenges and benefits. The top three challenges in implementing GAMP are establishing procedural control, handling management and change control, and finding an acceptable standard among the existing variations.

Establishing procedural control is a challenge in using GAMP guidelines because new frameworks may be necessary to gauge the validity of systems. Most pharmaceutical companies have already established a baseline that adheres to standards and regulations that exist today, but they may not have a procedure to check the processes that are in place. This could cause resistance among software developers who may prefer not to work within the confines of specifications and procedures developed by others. Specifications and procedures developed by previous software developers may hinder ways to adjust computer systems, but varying interpretations of GAMP guidelines allow for multiple solutions.

Another hurdle is change control. In the development or modification of computer systems, companies with even the highest of standards can suffer setbacks along the systems development lifecycle. Sometimes minor tweaks by the software programmer may cause breakdowns after validation changes have been implemented. Internal processes and procedures must be established to guard against these occurrences.

Effective documentation management is fundamental for compliance. Any inaccuracies or missing information renders all other efforts moot. Moreover, implementing a formal document management application may be cost-prohibitive for some organizations. Some companies simply use what’s in the GAMP checklists to evaluate their systems. Today’s environment demands a thorough process to show validation.

The benefits of utilizing the GAMP approach for both users and suppliers include:

  • Improved understanding of the subject with the introduction of common terminology
  • Reduced cost and time to achieve compliant systems
  • Reduced time and resources for revalidation or regression testing and remediation
  • Reduced cost of qualification
  • Enhanced compliance with regulatory expectations
  • Established responsibility for all involved parties

When the FDA introduced its current Good Manufacturing Practices (cGMP) for the 21st century initiative, companies shifted their approach to validation. Formerly, they only had to heed a set of rules that accounted for every piece of equipment that was used. Now they can take a risk-based approach to validation by addressing safety, efficacy and quality in their product considerations. This enables the industry to place its investments where they make the most sense. The onus ultimately falls on manufacturers to accept greater responsibility for validating their systems, with the attendant benefits of cost and time-to-market savings.

GAMP helps provide a quality product from the manufacturer, and helps to limit the pharmaceutical industry’s culpability by ensuring proper steps were in place to deliver a quality product through validated systems. By incorporating input from the full spectrum of stakeholders, fine-tuning and further development of the process is geared towards benefiting the life sciences industry and the general consumer market.

The tools exist for companies to take the steps needed to reap the benefits of validation. Understanding and early adoption of GAMP can increase a company’s competitive position, especially with the implementation of new technologies. By staying aware of technological innovations, companies are able to increase efficiency, minimize risks and reduce costs.

Software Validation

Validation is a critical tool to assure the quality of computer system performance. Computer system software validation increases the reliability of systems, resulting in fewer errors and less risk to process and data integrity. It also reduces long-term system and project costs by minimizing the cost of maintenance and rework.

Software validation commences with a user requirements specification (URS). The URS describes the critical functionalities required for the intended use. It is essential that the document is properly scoped so that the procurement, installation, commissioning, validation, user training, maintenance, calibration and cleaning tasks are all investigated and defined adequately.

To scope and define an adequate validation procedure, the URS has to be sufficiently detailed for various assessments to be made. The main assessment that concerns qualification documentation is the risk assessment, which is concerned solely with ensuring that the degree of validation proposed is compliant with regulatory requirements.

So at this early stage it is necessary to execute a Validation Risk Assessment protocol against the end user’s requirements. This step ensures that the more obscure pieces of ancillary equipment and support services are fully understood, and their requirements investigated, priced and included in the final issue of the URS, which will be sent out with the Request to Tender. This stage is essential if the URS is to define accurately what depth and scope of validation is appropriate for verifying that the software will deliver all the requirements detailed in the URS.

The outcome of the Validation Risk Assessment (VRA) drives a split in the scope of the software validation documentation. If the VRA categorizes the software as requiring Full Life Cycle Validation (FLCV), then a considerable amount of the validation effort goes into establishing how the software originated and was designed and developed, in order to establish that its basic concept and development are robust, sound and in accordance with best practices.

The original development plans, code reviews, methods reviews and testing plans must be available for this software validation documentation to be executed successfully. Once this proof of quality build is established, validation follows a more conventional path of inspections and verifications.

Software that is not classified as requiring FLCV treatment does not need this depth of verification into its build history and is validated mainly by the more conventional path of inspections and verifications.
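
As a rough illustration of how the VRA outcome might steer documentation scope, the sketch below encodes the two-way split described above. The document names and the FLCV flag are assumptions for demonstration, not a prescribed GAMP deliverable list.

    # Illustrative sketch only: the FLCV split is from the text above,
    # but the document names are assumptions, not a GAMP deliverable list.

    COMMON_DOCS = ["installation verification", "operational verification"]
    FLCV_DOCS = ["development plan review", "code review records",
                 "methods review records", "supplier test plan review"]

    def validation_scope(requires_flcv: bool) -> list[str]:
        """Documents implied by the VRA outcome."""
        if requires_flcv:
            # Full Life Cycle Validation: first prove the quality of
            # the build history, then follow the conventional path.
            return FLCV_DOCS + COMMON_DOCS
        return COMMON_DOCS          # conventional inspections only

    print(validation_scope(requires_flcv=True))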

Dynamic Testing

Dynamic testing verifies the execution flow of software, including decision paths, inputs, and outputs. It involves creating test cases, test vectors and oracles, and executing the software against these tests. The results are then compared with the expected or known correct behavior of the software. Because the number of execution paths and conditions grows exponentially with the number of lines of code, testing all possible execution traces and conditions is impossible.
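
For example, a dynamic test pairs concrete inputs (test vectors) with an oracle stating the expected output. The dilution function below is a made-up example used only to show the pattern.

    # Dynamic-testing sketch: execute the software against chosen inputs
    # and compare results with an oracle. The function under test is a
    # hypothetical example, not from any particular system.

    def dilution_volume(stock_conc: float, final_conc: float,
                        final_vol: float) -> float:
        """Volume of stock needed: C1*V1 = C2*V2 solved for V1."""
        if final_conc > stock_conc:
            raise ValueError("cannot dilute upward")
        return final_conc * final_vol / stock_conc

    # Test vectors and oracles (expected values worked out by hand).
    cases = [((10.0, 1.0, 100.0), 10.0),
             ((5.0, 2.5, 50.0), 25.0)]
    for args, expected in cases:
        assert abs(dilution_volume(*args) - expected) < 1e-9
    print("all dynamic test cases passed")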

Static Analysis

Code inspections and testing can reduce coding errors; however, experience has shown that the process needs to be complemented with other methods. One such method is static analysis, which largely automates parts of the software qualification process. The technique attempts to identify errors in the code but does not necessarily prove their absence. Static analysis is used to identify potential and actual defects in source code.
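
As a small illustration, a static type checker such as mypy can flag the potential defect below from the annotations alone, without executing the code; the function is contrived for demonstration.

    # Contrived example of a defect that static analysis can catch
    # without running the code: lookup() may return None, so calling
    # .upper() on its result risks a None dereference. A type checker
    # such as mypy reports the problem from the annotations alone.
    from typing import Optional

    def lookup(table: dict[str, str], key: str) -> Optional[str]:
        return table.get(key)

    value = lookup({"a": "1"}, "b")     # returns None for this key
    # value.upper()  # flagged statically: result may be None
    if value is not None:               # the guard the analyzer asks for
        print(value.upper())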

Abstract Interpretation Verification

A code verification solution that includes abstract interpretation can be instrumental in assuring software safety and a good quality process. It is a sound verification technique that enables high integrity in embedded devices. Regulatory bodies such as the FDA, and some segments of industry, recognize the value of sound verification principles and use tools based on them.
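
A minimal sketch of the idea behind abstract interpretation is shown below, using an interval domain that soundly over-approximates every value a variable could take; real verification tools are far more sophisticated, and the program being analyzed here is a toy example.

    # Interval-domain sketch of abstract interpretation: instead of
    # executing with concrete values, each variable is tracked as a
    # sound [lo, hi] over-approximation of every value it could take.

    class Interval:
        def __init__(self, lo: float, hi: float):
            self.lo, self.hi = lo, hi

        def __add__(self, other: "Interval") -> "Interval":
            return Interval(self.lo + other.lo, self.hi + other.hi)

        def __mul__(self, other: "Interval") -> "Interval":
            p = [self.lo * other.lo, self.lo * other.hi,
                 self.hi * other.lo, self.hi * other.hi]
            return Interval(min(p), max(p))

        def __repr__(self):
            return f"[{self.lo}, {self.hi}]"

    # Abstractly "run" y = x * x + offset for every sensor value
    # x in [-10, 10] and offset in [0, 5], with no concrete inputs.
    x = Interval(-10, 10)
    y = x * x + Interval(0, 5)
    print("y always lies within", y)   # prints [-100, 105]
    # The bound is sound but loose: the true range of x * x is
    # [0, 100], but the domain cannot see that both factors are the
    # same variable - analysis may report errors that cannot occur.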

Validation

Validation Protocols for Pharmaceutical Industries

For pharmaceutical industries, product quality is paramount. Minor inconsistencies can lead to major disasters. To maintain quality assurance, consistency and risk assessment, industries conduct validation of processes and equipment. Validation is documented evidence of the consistency of processes and equipment. Design Qualification (DQ), Installation Qualification (IQ), Operational Qualification (OQ) and Performance Qualification (PQ) are an essential part of quality assurance through equipment validation.

DQ IQ OQ PQ protocols are ways of establishing that the equipment which is being used or installed will offer a high degree of quality assurance, so that manufacturing processes will consistently produce products that meet predetermined quality requirements.

Design Qualification (DQ)

Design qualification verifies that the design meets particular requirements relating to quality manufacturing and pharmaceutical practices. It is important to take these procedures into consideration and follow them closely. Along with process validation, pharmaceutical manufacturers must conduct design qualification during the initial stages. For DQ to be considered complete, the other qualifications, i.e. IQ, OQ and PQ, need to be implemented on each instrument and on the system as a whole.

DQ allows manufacturers to make corrections and changes early, reducing costs and avoiding delays. Changes made during DQ should be documented, which makes DQ on the finalized design easier and less prone to errors. By using a design validation protocol, it is possible to determine whether the equipment or product will deliver its full functionality and conform to the requirements of the validation master plan.

Installation Qualification (IQ)

Any new equipment is first validated through Design Qualification to check that it is capable of producing the desired results, but its performance in a real-world scenario depends on the installation procedure that follows. Installation Qualification (IQ) verifies that the instrument or equipment being qualified, as well as its sub-systems and any ancillary systems, has been delivered, installed and configured in accordance with the manufacturer’s specifications or installation checklist. All procedures for maintenance, cleaning and calibration are drawn up at the installation stage. The IQ also details all the current Good Manufacturing Practice (cGMP) requirements that apply to the installation qualification.

Conformance with cGMP requires that, whatever approach is used, it is fully documented in the individual Validation Plan. The IQ should not wait for the Factory Acceptance Testing (FAT) or commissioning tasks to finish; it should start before these tasks are completed, enabling the validation team to witness and document the final FAT and commissioning testing. Integrating these activities greatly reduces costly and time-consuming unnecessary retesting.

These requirements must all be satisfied before the IQ can be completed and the qualification process is allowed to progress to the execution of the OQ.

Operational Qualification (OQ)

Operational Qualification is an essential process during the development of equipment required in the pharmaceutical industry. OQ is a series of tests that ensure the equipment and its sub-systems will operate within their specified limits consistently and dependably. Equipment may also be tested during OQ for qualities such as using an expected and acceptable amount of power or maintaining a certain temperature for a predetermined period of time. OQ follows a specific procedure to maintain the thoroughness of the tests and the accuracy of the results. The protocol must be detailed and easily replicated so that equipment can be tested multiple times by different testers. This ensures that the results are reliable and do not vary from tester to tester. OQ is an important step in developing safe and effective equipment.
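
As an illustrative sketch of one such OQ check, the script below evaluates a temperature-hold test against specified limits. The setpoint, tolerance and logged readings are made-up values, not a protocol from any standard.

    # OQ-style check sketch: verify logged chamber temperatures stayed
    # within specified limits for the full hold period. Setpoint,
    # tolerance, and readings are illustrative assumptions.

    SETPOINT_C = 37.0
    TOLERANCE_C = 0.5      # acceptance limits: setpoint +/- tolerance

    def hold_test_passes(readings_c: list[float]) -> bool:
        """True if every reading is within the acceptance limits."""
        lo, hi = SETPOINT_C - TOLERANCE_C, SETPOINT_C + TOLERANCE_C
        return all(lo <= t <= hi for t in readings_c)

    # One reading per minute over a (shortened) hold period.
    log = [37.1, 36.9, 37.0, 37.3, 36.8]
    result = "PASS" if hold_test_passes(log) else "FAIL"
    print(f"temperature hold: {result}")  # record result in the protocol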

Performance Qualification (PQ)

PQ is the final step in the equipment qualification process, and it involves verifying and documenting that the equipment works reproducibly within a specified working range. Rather than testing each instrument individually, all of them are tested together as part of a partial or overall process. Before qualification begins, a detailed test plan is created based on the process description.

Process Performance Qualification (PPQ) protocol is a vital part of process validation and qualification, which is used to ensure ongoing product quality by documenting performance over a period of time for a certain process.

Equipment qualification through DQ IQ OQ PQ practices is a part of Good Manufacturing Practice (GMP), through which manufacturers and laboratories can ensure that their equipment delivers consistent quality. It reduces the margin for errors, so the product quality can be maintained within industry standards or regulatory authority requirements. When qualification of equipment is not needed very frequently, performing it in-house might not be feasible, so smaller laboratories might benefit from scheduling external equipment validation services on a regular basis instead.

Pinch Analysis & Process Integration Services

In a highly competitive global economy, maximizing results by eliminating costly inefficiencies is critical.

Chemical and process industries in India consume large amounts of energy, primarily for heating, cooling, and electrical power.

Traditional attempts to reduce energy use focused only on individual pieces of equipment and units. Process Integration takes a streamlined, systematic, “total site as a whole” approach to heating and cooling requirements, electrical power and water. It ensures maximum advantage by providing alternative designs and structured road maps for long-term energy savings.

Though Process Integration covers a wide area, pinch analysis and process integration service companies focus mainly on facilities where distillation consumes the major share of energy and where plenty of waste energy is available.

Reactions such as nitration, oxidation and hydrogenation evolve a great deal of heat because of their exothermic nature. Typically, the distillation processes and other downstream operations around these reactions consume a considerable amount of heat.

Thus, running these reactions as continuous processes, so that the heat they evolve is utilized in downstream operations, integrates the process and can result in significant savings and safer operation.

See how Panorama can help you migrate from batch reactions to continuous ones by piloting the whole process up to commercial scale.

Until now, Process Integration has mainly been used at bigger plants; Panorama’s idea is to make it work in smaller plants as well.

Process Integration Methodology:

  • Generate heat and material balances using process simulation
  • Perform energy analysis to identify alternatives
  • Make modifications to processing conditions
  • Redesign equipment in light of the above changes
  • Analyze cost data and finalize the proposal

Waste Heat Recovery Units (WHRU)

Exhaust gases from various processes, and even the exhaust streams of conditioning units, carry waste heat that can be recovered to provide useful heat and reduce fuel consumption.

There are many different commercial recovery units for transferring energy from a hot medium to a cold one:

  • Recuperators
  • Regenerators
  • Heat pipe exchanger
  • Economizers
  • Heat pumps, etc.
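
For a rough sense of what such a unit can recover, the sketch below applies the duty estimate Q = ṁ·cp·ΔT with an assumed heat-exchanger effectiveness; every figure is an illustrative assumption.

    # Back-of-envelope recovery estimate for a recuperator, using
    # Q = m_dot * cp * dT with an assumed effectiveness. All values
    # are illustrative assumptions.

    m_dot = 2.0          # exhaust mass flow, kg/s (assumed)
    cp = 1.05            # exhaust specific heat, kJ/(kg*K) (assumed)
    t_exhaust_in = 350.0 # exhaust inlet temperature, C (assumed)
    t_cold_in = 25.0     # cold-stream inlet temperature, C (assumed)
    effectiveness = 0.7  # fraction of the ideal duty recovered (assumed)

    q_max = m_dot * cp * (t_exhaust_in - t_cold_in)   # ideal duty, kW
    q_recovered = effectiveness * q_max
    print(f"recoverable duty: about {q_recovered:.0f} kW")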

Combined Heat and Power (CHP)

Waste heat of varying grades can be found in the final products of certain processes or as a by-product in industries such as steelmaking. Units or devices that recover this waste heat and transform it into electricity are called CHP units. Such units may, for example, use an Organic Rankine Cycle (ORC) with an organic working fluid. The fluid has a lower boiling point than water, allowing it to boil at low temperature and form a superheated vapor that can drive the blades of a turbine, and thus a generator.
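
To put rough numbers on this, the sketch below bounds the electricity obtainable from a waste-heat stream by the Carnot limit for the source and sink temperatures, then applies an assumed ORC fraction of that limit; all figures are illustrative.

    # Rough ORC output estimate: cap conversion at the Carnot limit for
    # the source/sink temperatures, then apply an assumed fraction of
    # that limit for a real Organic Rankine Cycle. Figures are assumed.

    q_waste_kw = 500.0        # recoverable waste heat (assumed)
    t_hot_k = 273.15 + 150.0  # waste-heat temperature (assumed)
    t_cold_k = 273.15 + 30.0  # heat-rejection temperature (assumed)

    eta_carnot = 1.0 - t_cold_k / t_hot_k          # thermodynamic ceiling
    eta_orc = 0.5 * eta_carnot                      # assumed ORC fraction
    print(f"Carnot limit: {eta_carnot:.1%}")
    print(f"estimated electrical output: {q_waste_kw * eta_orc:.0f} kW")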

Panorama’s offerings

  • Process Integration solutions
  • Energy analysis solutions
  • Batch to Continuous Processing Migration Solutions
  • Waste Heat Recovery - simple solutions involving heat recovery only
  • Waste Heat Recovery - comprehensive solutions including CHP (Combined Heat and Power)

Panorama maximizes energy efficiency in both initial design and ongoing operations through best-practice pinch analysis. Thermal pinch analysis and process integration form a systematic methodology for determining optimal energy efficiency. In most cases we find millions of dollars of annual operational savings (in facilities more than 15 years old). For greenfield developments, we enable process engineers to properly integrate pinch tools into the conceptual process design phase, on which the foundation of the entire lifecycle is built.

For existing facilities, Panorama provides the most effective action plan by understanding complete energy balance for maximum efficiency, identifying suboptimal energy exchange between process streams, and analyzing the most cost effective balance between energy savings and capital expenditure to achieve optimal efficiency.

On short notice, our experts can pull together engineering scopes and feasibility studies to produce ready-to-use initial design work.

Our dedicated onsite teams are available to interact directly with your process specialists, should onsite work be preferred to remote analysis. Regardless of geography, we apply rigorous thermodynamic principles to systematically analyze chemical processes and the surrounding utility systems.

Panorama’s pinch analysis results in financial savings through better process heat integration.

Improved Energy Consumption Via Heat Integration & Pinch Analysis

A well-regarded tool for achieving energy efficiency is process heat integration with pinch analysis. This article presents an overview of pinch analysis and how it is employed in operations and process design to achieve real-world energy efficiency gains. Heat integration comprises several techniques that help engineers evaluate entire sites and processes rather than focusing on individual operations.

These include knowledge-based systems, hierarchical design methods, pinch analysis, and numerical and graphical techniques. Pinch methods dominate in the area of energy efficiency, and the terms process integration (PI), heat integration and pinch analysis are frequently used interchangeably.

Pinch analysis, also known as process integration, energy integration, heat integration or pinch technology, is employed to achieve minimal energy consumption by optimizing energy supply methods, process operating conditions and heat recovery systems. It is a methodology for minimizing energy consumption in chemical processes by calculating thermodynamically feasible energy targets.

Pinch analysis is a systematic technique for analyzing the flow of heat through an industrial process, in which process data are represented as a set of streams or energy flows. By the Second Law of Thermodynamics, heat flows from hot to cold objects. A central concept is to represent the overall heat demand and heat release of the process as a function of temperature.

To identify the Pinch and the targets for hot and cold utilities, the Problem Table algorithm is the tool to use; it is the fundamental computational tool. The Pinch designates the location where heat recovery is most constrained, and it is characterized by ΔTmin, a minimum temperature difference between hot and cold streams.
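
A minimal sketch of the Problem Table cascade is shown below. The four streams and ΔTmin are a textbook-style example rather than plant data, and CP is the constant heat capacity flowrate in kW/K.

    # Problem Table / heat cascade sketch for pinch targeting. Streams
    # and dTmin are illustrative; CP is the heat capacity flowrate
    # (kW/K), assumed constant, as the classical method requires.

    DT_MIN = 10.0  # minimum approach temperature, K (assumed)

    # (kind, supply T in C, target T in C, CP) - textbook-style data.
    streams = [
        ("hot",  170.0,  60.0, 3.0),
        ("hot",  150.0,  30.0, 1.5),
        ("cold",  20.0, 135.0, 2.0),
        ("cold",  80.0, 140.0, 4.0),
    ]

    def shifted(kind, t):
        # Hot streams shift down and cold streams shift up by dTmin/2,
        # so feasible heat exchange appears as overlap on one scale.
        return t - DT_MIN / 2 if kind == "hot" else t + DT_MIN / 2

    bounds = sorted({shifted(kind, t) for kind, ts, tt, _ in streams
                     for t in (ts, tt)}, reverse=True)

    # Net surplus of each interval: (sum CP_hot - sum CP_cold) * dT.
    surpluses = []
    for t_hi, t_lo in zip(bounds, bounds[1:]):
        net_cp = 0.0
        for kind, ts, tt, cp in streams:
            lo, hi = sorted((shifted(kind, ts), shifted(kind, tt)))
            if lo <= t_lo and hi >= t_hi:  # stream spans this interval
                net_cp += cp if kind == "hot" else -cp
        surpluses.append(net_cp * (t_hi - t_lo))

    # Cascade surpluses from the top; the largest deficit fixes the
    # minimum hot utility, and the feasible cascade locates the pinch.
    cascade, running = [0.0], 0.0
    for s in surpluses:
        running += s
        cascade.append(running)
    q_hot = -min(cascade)                 # minimum hot utility target
    feasible = [q_hot + c for c in cascade]
    q_cold = feasible[-1]                 # minimum cold utility target
    pinch_idx = min(range(len(feasible)), key=lambda i: abs(feasible[i]))

    print(f"minimum hot utility:  {q_hot:.1f} kW")    # 20.0 kW here
    print(f"minimum cold utility: {q_cold:.1f} kW")   # 60.0 kW here
    print(f"pinch (shifted):      {bounds[pinch_idx]:.1f} C")  # 85 C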

As a result, the system can be divided into two separate subsystems located above and below the Pinch respectively. Hot utility is only required above the Pinch, while cold utility is only required below it. The hot and cold utility consumption identified in this way constitutes the Minimum Energy Requirements (MER). If any heat is transferred across the Pinch, no design can achieve MER.

Separating the original problem at the Pinch may introduce redundancy in the number of heat exchangers. In order to reduce the number of units, removing the Pinch constraint may be necessary, especially when capital cost is high; the extra energy consumption this entails is traded off against the reduction in capital cost.

As a result, the heat recovery problem becomes an optimization of both capital and energy costs, constrained by a minimum temperature approach in the design of the heat exchangers. Effective heat integration requires careful data extraction and stream selection in pinch analysis. A constant CP (heat capacity flowrate) for each stream is the major computational assumption in pinch analysis.

Effluent Minimization Strategies for Waste Minimization and Cost Reduction

Waste minimization is essential for every industry that manufactures products and incurs costs. In India alone, a large number of manufacturers produce simple products such as plastics, and their operations are often subject to questions of waste. It is well understood that minimizing waste maximizes profit. The consumption of the earth’s natural resources is seen as one of the major environmental problems facing the world today, and industrial waste and emissions can have drastic effects, both financial and environmental. Moreover, issues such as global warming and ozone depletion are driven in part by local manufacturers. In India alone, the amount of emissions caused by manufacturers is alarming, and businesses need to come up with new approaches to waste minimization.

Waste minimization & Cost reduction strategies

Waste minimization has become one of the business regulations for companies in India, and thousands of manufacturers have been induced to adopt waste reduction programs. However, very few people truly understand the cost that waste imposes on their own businesses, or just how much it costs the environment. Waste reduction is therefore a tool for creating a better world with more competitive industries.

When looking at waste minimization, there are three main drivers of this change: people, systems and technology.

  • People: Changing a notion or culture can only be accomplished if the effort is first targeted at people. People influence systems, and systems influence technology. People should be educated on waste minimization and cost reduction. They should be made aware that even the smallest amount of raw material saved in a production process can have multiple uses and benefits and should therefore not be wasted. If the whole of India starts to see waste minimization differently, it may well have a greater effect globally.
  • Systems: A systematic approach should also be geared towards measuring and controlling problems as they occur, with an eye on maintaining efficiency levels. Apart from the obvious benefits of waste minimization, there are also cost implications. Businesses should therefore put new systems in place to ensure that people are producing efficiently.
  • Technology: Lastly, technology could be a major driver of this new world system. Capital investment should be directed at improving manufacturing productivity and reducing waste creation. Technology plays a major deciding role in the world we live in today; it should therefore motivate waste minimization and help reduce costs.

A number of companies have also developed strategies to ensure reduced amounts of waste in manufacturing processes. This reflects a growing awareness that raw materials can serve several production processes, with even by-products having value in the production of commodities. Companies are advised to study the true cost of their waste and to create new strategies for managing it.

Waste minimization and cost reduction should be at the forefront of the minds of people and businesses in India. They would make for a better, more efficient world and also reduce the cost of doing business.