Category "Simulation"

I mentioned the process automation concept of Isight in a previous simulation automation blog. Isight is a simulation automation and parametric optimization tool used to create workflows that automate the repetitive process of model update and job submission, with specific objectives attached to the workflow. The objective could be the achievement of an optimal design through any of the techniques available in Isight: design of experiments (DOE), optimization, Monte Carlo simulation, or Six Sigma. In this blog post, I will discuss the various value-added algorithms in the DOE technique; I will cover the other techniques in future blogs.

Why design of experiments

Real-life engineering models involve multiple design variables and multiple responses. There are two ways to evaluate the effect of a change in a design variable on a response: the vary-one-at-a-time (VOAT) approach or the design of experiments (DOE) approach. The VOAT approach is not viable because:

  • It ignores interactions among design variables as well as averaged and non-linear effects.
  • In models with many FE entities, each iteration is very expensive. VOAT does not offer a way to create high-fidelity models with a manageable number of iterations.

With the DOE approach, the user can study the design space efficiently, manage a multi-dimensional design space, and select design points intelligently instead of guessing manually. The objective of any DOE technique is to generate an experimental matrix using formal, proven methods. The matrix explores the design space, and each technique creates its design matrix differently. There are multiple techniques, discussed shortly, and they fall into two broad configurations:

  • Configuration 1: The user defines the number of levels and their values for each design variable. The chosen technique and the number of variables determine the number of experiments.
  • Configuration 2: The user defines the number of experiments and the range of each design variable.
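
To make Configuration 1 concrete, here is a minimal Python sketch of the simplest such technique, a full factorial, in which the user-chosen levels fix the number of experiments. The variable names and level values are hypothetical; Isight of course builds these matrices internally:

# Configuration 1 sketch: user-defined levels per design variable;
# the full-factorial technique then fixes the number of experiments.
from itertools import product

levels = {
    "thickness_mm": [1.0, 1.5, 2.0],   # three levels
    "fillet_radius_mm": [5.0, 10.0],   # two levels
    "load_kN": [10.0, 20.0],           # two levels
}

names = list(levels)
matrix = [dict(zip(names, combo)) for combo in product(*levels.values())]

print(len(matrix), "experiments")      # 3 x 2 x 2 = 12 runs
for experiment in matrix:
    print(experiment)

Even this tiny example shows why the bookkeeping is better left to a tool: the run count multiplies quickly as variables and levels are added.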

Box-Behnken Technique

This is a three-level factorial design consisting of orthogonal blocks that excludes extreme points. Box-Behnken designs are typically used to estimate the coefficients of a second-degree polynomial. The designs either meet, or approximately meet, the criterion of rotatability. Since Box-Behnken designs do not include any extreme (corner) points, they are particularly useful in cases where the corner points are either numerically unstable or infeasible. Box-Behnken designs are available only for three to twenty-one factors.
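
To show where the run count comes from, here is a minimal Python sketch of one common Box-Behnken construction in coded units: a two-level factorial on every pair of factors with all remaining factors held at their mid-level, plus centre points. This illustrates the idea only; it is not Isight's implementation, and real tools typically add several centre points rather than one:

# Box-Behnken sketch in coded units (-1, 0, +1); no corner points appear
# because only two factors at a time are varied while the rest stay at 0.
from itertools import combinations, product

def box_behnken(k, n_center=1):
    runs = []
    for i, j in combinations(range(k), 2):       # every pair of factors
        for a, b in product((-1, 1), repeat=2):  # 2x2 factorial on the pair
            point = [0] * k                      # remaining factors mid-level
            point[i], point[j] = a, b
            runs.append(point)
    runs.extend([0] * k for _ in range(n_center))
    return runs

design = box_behnken(3)
print(len(design), "runs")   # 12 edge midpoints + 1 centre point = 13

Note that every run keeps at least one factor at its mid-level, which is exactly why the extreme corners of the design space are never visited.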

Central Composite Design Technique […]

For many years, finite element modeling has been the job of a specialist; the tools used to perform even simple finite element analyses have been complex enough to require a subject matter expert. This is primarily due to the complex, difficult-to-understand graphical user interfaces of these products. The job is made harder still by the advanced engineering subject knowledge required of the analyst.

Can a mechanical designer who uses CAD tools to create engineering drawings be trained to perform engineering simulations?

Given the products available today, the answer is “yes.”

A CAD designer using CATIA can create and execute simple finite element models within the CATIA environment by using CATIA workbenches that have been developed for simulations. This makes it intuitive and easy for designers to ensure that their parts meet their design requirements.


How the simulation methodology is simplified using designer-level tools

  • No need for an expert-level analyst tool to perform simple finite element simulations.
  • No need for manual data transfer between the design and analysis departments.
  • No need for geometry clean-up tools to fix data translation errors.

There are obvious benefits to adopting this simplified approach that integrates the design and analysis environments. The designer can predict design problems early in the design process and can subsequently evaluate various design alternatives in less time. This is primarily due to the tight integration of designer-level tools with knowledge-based engineering, which allows the designer to deliver a better product in less time.

Part Level Simulation

From a geometrical perspective, the simulation model can be generated at the part level to begin with. The native integration within CATIA allows users to perform stress, displacement, and vibration analysis at any time in the design process, allowing more accurate sizing of parts and fewer design iterations. Individual parts consisting of solid, surface, and wireframe geometries can be analyzed under a variety of loading conditions. The analysis specifications, such as loads and restraints, are associative with the design, allowing users to perform analyses quickly and easily. These specifications are then automatically incorporated into the underlying finite element model, meaning that users do not have to work directly with the finite element model. “Virtual parts” allow items like forces, moments, and restraints to be easily modeled without requiring a detailed geometric representation.

Standard reports can be automatically generated in HTML format, providing clear and detailed information about the results of the analysis, including images associated with the computations. These reports can be used to document the analyses that have been performed and to communicate the results of the analysis to other stakeholders in the organization. CATIA V5 Analysis users benefit naturally from the overall PLM solution provided by Dassault Systèmes, including ENOVIA V5 for data and product lifecycle management. CATIA V5 Analysis users can store, manage, and version all the data associated with their product’s simulation and share the information within the extended enterprise. This unique capability allows collaboration and provides access to advanced PLM practices such as concurrent engineering and change management.


Assembly Level Simulation

If the concept of virtual parts no longer holds good and the complexities of various parts interacting with each other make assembly-level simulation mandatory, it is possible to create analysis models for assemblies as well. The analysis of assemblies, including an accurate representation of the way the parts interact and are connected, allows for more realistic and accurate simulation. The designer does not have to make simplifying assumptions about the loading and restraints acting on an individual part. Instead, the part can be analyzed within the environment in which it operates, with the loading automatically determined based on the way the part is connected to and interacts with surrounding parts.

The various types of connections that can be modeled include bolted connections, welded connections, pressure-fitting connections, and many more. To make the job even easier for the designer, these connections can be defined using assembly-level constraints that already exist in the CATProduct model. When the design changes, the associated assembly constraints and the corresponding FEA connections are updated, producing an updated FEA model that is ready for analysis.

Concurrent Engineering Made Easier

The “assembly of analysis” capability enables concurrent engineering. For example, the various parts in an assembly can be modeled and meshed separately by different users. They can either use the CATIA V5 meshing tools or import orphan meshes (meshes that don’t have any geometry associated with them) developed outside of CATIA Analysis using a variety of different modeling tools. The user responsible for analyzing the assembly can consolidate the different meshes, connect the parts, apply the loading specifications, and run the simulation. This can significantly reduce the turnaround time when analyzing large assemblies, particularly since some of the parts may have already been analyzed and their analysis models would therefore already be available.


Extended solver capabilities

The basic FEA solver present in the CATIA designer workbench is called the “Elfini” solver; it can model only simpler physical problems involving linear materials, small deformations, small rotations, and bonded contact. Real-life problems can be much more complex and may call for an advanced solver. To address such scenarios, it is possible to embed the well-known nonlinear solver Abaqus in the CATIA designer environment. Abaqus can model the effects of geometric nonlinearity, such as large displacements, and allows nonlinear materials to be included, such as the yielding of metals and nonlinear elastic materials like rubber. It also offers more advanced contact capabilities, including the ability to model large relative sliding of surfaces in contact.

The Abaqus capability enables the effects of multiple steps to be analyzed, where the loading, restraints, contact conditions, etc., vary from one step to the next. This powerful technique allows complex loading sequences to be modeled. For example, a pressure vessel might be subjected to an initial bolt-tightening step, followed by internal pressurization, and conclude with thermal loading.
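
As an illustration of how such a sequence might be set up programmatically, here is a hedged sketch using the Abaqus/CAE Python scripting interface. It runs only inside the Abaqus/CAE kernel, and the model name, step names, and the pre-existing model itself are assumptions:

# Multi-step pressure-vessel sequence, sketched with the Abaqus/CAE
# scripting interface. Assumes a model named 'PressureVessel' exists.
from abaqus import mdb
from abaqusConstants import ON

model = mdb.models['PressureVessel']

# Step 1: bolt tightening (static, with nonlinear geometry switched on).
model.StaticStep(name='BoltTighten', previous='Initial', nlgeom=ON)

# Step 2: internal pressurization, starting from the preloaded state.
model.StaticStep(name='Pressurize', previous='BoltTighten')

# Step 3: thermal loading (e.g. a predefined temperature field) applied
# on top of the pressurized state.
model.StaticStep(name='Thermal', previous='Pressurize')

Because each step takes the previous one as its starting state, the stress history accumulates exactly as the sequence above describes.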


Composites have always had a well-defined place in the aerospace industry because of their properties: light weight, which makes the overall design lighter, and toughness, which lets the overall design bear the aero-structural loads. At present, from aircraft fairings to train noses, boat hulls, and wind turbines, composites offer dramatic opportunities to meet increasingly cost-driven market requirements and environmental concerns. However, modeling composites in a seamless, collaborative environment has always been a challenge, because the multiple aspects of composites modeling, such as design, simulation, and manufacturing, are hard to bring together on a single platform.

CATIA composites workbench now offers a solution to address various aspects of composites modeling in a unified manner. The objective of this blog post is to provide information on composites workbench capabilities with respect to design, simulation, and manufacturability of composites.

DESIGN IN ANALYSIS CONTEXT

There are different ways to start the preliminary design of a composite part, but zone-based design is ideal for capturing analysis constraints and predicting the behavior of the part inside the design environment by importing thickness laws, which are calculated as a result of FE analysis. The composites part design workbench in CATIA provides easy-to-use, dedicated zone creation and modification features. Zone-based modeling contributes to significant time savings, with the ability to perform concurrent engineering with mating parts. The image below shows a wing panel with a grid created from ribs and spars in assembly context, with the thickness law for each cell mapped onto the grid from a spreadsheet.

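As a minimal illustration of that spreadsheet mapping step, the Python sketch below reads a per-cell thickness law from a CSV export; the file name and column layout are assumptions, not the CATIA interface:

# Read a thickness law per grid cell from a hypothetical CSV export.
import csv

thickness_law = {}
with open("wing_panel_thickness.csv", newline="") as f:
    for row in csv.DictReader(f):   # assumed columns: cell_id, thickness_mm
        thickness_law[row["cell_id"]] = float(row["thickness_mm"])

print(len(thickness_law), "grid cells mapped")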

Once the grid information is ready, the composites workbench provides highly productive automatic ply generation from zones, with automatic management of the ply staggering and stacking rules. The ability to quickly and automatically transition from zones to plies while keeping full associativity allows the designer to focus on the design intent and helps dramatically reduce the number of geometrical tasks required to design the part.


To further check the viability of a design from the structural strength perspective, it is possible to perform FEA simulation within the CATIA environment using the Elfini solver of CATIA Analysis. Full associativity with the composites workbench is maintained, and true fiber angles are taken into account. To address the nonlinear aspects of FEA, it is possible to export the ply data in the form of layup files to Abaqus/CAE using the composites fiber modeler plug-in. In case design modifications are needed, it is possible to edit and modify any ply or sequence in the composites workbench and instantly export the modified layup file to the simulation workbench or Abaqus/CAE for validation. Thus designers and analysts can work together in collaboration during the composites development process, saving time, improving product quality, and preventing costly errors. […]

The Dassault Systèmes SIMULIA portfolio releases new versions of its software products every year, and this year is no different. The first release of Abaqus 2017 is now available for download at the media download portal. SIMULIA has developed and broadcast 2017 release webinars to make users aware of new features available in the 2017 release, but those webinars are long recordings ranging from one to two hours each, which can be daunting. This blog post will provide a brief highlight of the materials and Abaqus/Explicit updates in the Abaqus 2017 solver. A more detailed explanation of any mentioned update, or answers to further questions, can be obtained by listening to the webinar recordings at the SIMULIA 3DExperience user community portal, leaving a comment on this post, or contacting us.

SPH boundary condition improvements

In Abaqus 2017, SPH particles located on opposite sides of a surface do not interact with each other in the absence of a boundary condition; this is now the default setting, which was not the case in previous releases. There are further improvements in tensile instability control to prevent instability among particles subjected to local tensile stresses. Below is an example with SPH particles in two different chambers; the lower-chamber particles are subjected to a displacement boundary condition while the upper-chamber particles are not subjected to any boundary condition.

DEM improvements

  1. The series and parallel search algorithms for contact have been unified to improve DEM performance. The search cells are now created only once.
  2. It is now possible to run DEM jobs with particle generators in parallel mode. This means more than one particle generator can be active while a DEM job is running.
  3. In previous releases, only a fixed time increment scheme was available, and it was difficult for the user to predict the appropriate time increment. In the 2017 release, an automatic time increment scheme has been introduced.
  4. Adhesive particle mixing is now supported. The algorithm used is JKR (Johnson-Kendall-Roberts) adhesive inter-particle contact. Both Hertz contact and friction are supported.

Material Enhancements

1. There is some good news for users in the health care industry who design and manufacture cardiovascular stents: superelasticity, which previously required a user subroutine, is now available in the Abaqus 2017 material library. The motivation is Nitinol, a nickel-titanium alloy used in cardiovascular stents because of its superelasticity, shape memory effect, biocompatibility, and fatigue performance. The Nitinol model exhibits linear elastic austenite behavior at lower stresses. On further loading, transformation from austenite to martensite occurs, but the behavior is still linear elastic. Beyond full transformation, martensite exhibits elastic-plastic behavior. A similar phenomenon is observed in compression loading. The model is supported in Abaqus/CAE.

2. A multilinear kinematic hardening model is now available in Abaqus 2017. In previous releases, this model was available as a user subroutine material called ABQ_MULTILIN_KINHARD. Plasticity follows an array of perfectly plastic subvolumes obeying the von Mises criterion, each with a unique yield strength. This model offers more flexibility than the linear kinematic hardening model. It is available only in Abaqus/Standard and is intended for thermo-mechanical fatigue of metals. It is supported in Abaqus/CAE.

3. The definition of damage initiation and damage evolution for cohesive elements with traction-separation response has been enhanced to include rate-dependent cohesive behavior. It is available only in Abaqus/Explicit.

4. Non-linear damage initiation of ductile metals is now supported in Abaqus 2017. This model provides more flexibility to predict damage under arbitrary loading paths. It is available in both Abaqus/Standard and Abaqus/Explicit for the ductile, shear, and Johnson-Cook material models.

5. The parallel rheological framework model now supports plane stress elements as well, in both Abaqus/Standard and Abaqus/Explicit.

6. A new subroutine for user-defined thermal expansion coefficients, called VUEXPAN, has been introduced. This routine can be used in Abaqus/Explicit to define thermal strain increments as a function of temperature, time, element number, state, or field variables. It is available only with Mises plasticity, Hill plasticity, and the Johnson-Cook model.

Usability Enhancements

1. Enhancements in distortion control: In Abaqus/Explicit, it is possible to convert highly compressed solid elements to a linear kinematic formulation. Once that happens, the analysis does not stop even if the elements get inverted. This behavior is activated by default when solid elements are used with a crushable foam material.

2. Larger stable time increments in Abaqus/Explicit: In Abaqus 2017, an improved method of estimating the element characteristic length yields larger stable time increments. It is defined in the explicit step as follows:

*Dynamic, Explicit, improved DT method=YES (the default) or NO

It is also possible to invoke this method selectively for individual element sets instead of the whole model, as follows:

*Section Controls, improved DT method=YES or NO

 

The Dassault Systèmes SIMULIA portfolio releases new versions of its software products every year, and this year is no different. The first release of Abaqus 2017 is now available for download at the media download portal. SIMULIA has developed and broadcast 2017 release webinars to make users aware of new features available in the 2017 release, but those webinars are long recordings ranging from one to two hours each, which can be daunting. This blog post will provide a brief highlight of the Abaqus/Standard and Abaqus/Explicit updates in the Abaqus 2017 solver. A more detailed explanation of any mentioned update, or answers to further questions, can be obtained by listening to the webinar recordings at the SIMULIA 3DExperience user community portal, leaving a comment on this post, or contacting us.

Updates in Abaqus Standard

Abaqus/Standard 2017 has been substantially improved with respect to contact formulations. Below are the key highlights of the contact functionality improvements.

  • Edge-to-surface contact has been enhanced with beams as the master definition. This new approach captures the twisting of beams during frictional contact.
  • Cohesive behavior in general contact.

General contact has always been useful in situations where it is either cumbersome to visualize and define a large number of contact pairs, even using the contact wizard, or impossible to predict the contact interactions from the initial configuration. General contact now supports cohesive behavior, making it possible to define contact in situations like the one shown in the figure below.

 

Cohesive contact does not constrain rotational degrees of freedom; these DOFs should be constrained separately to avoid pivot ratio errors.

There have been a few other changes in cohesive contact interactions. In the 2016 release, only first-time cohesive contact was allowed by default: either cohesive behavior at an initially closed contact, or an initially open contact that could convert to closed cohesive contact once. In the 2017 release, by default only an initially closed contact maintains cohesive behavior, and an open contact cannot convert to cohesive contact later. However, it is possible to change the default settings.


 

  • Linear complementarity problem

A new step feature has been introduced to remove some limitations of the perturbation step. In earlier releases, it was not possible to define a contact in a perturbation step that changes its status from open to closed or vice versa. In the 2017 release, a linear complementarity problem (LCP) technique has been introduced in the perturbation step to define frictionless, small-sliding contact that can change its contact status. No other forms of nonlinearity are supported in perturbation steps. LCP is available only for static problems; dynamic steps are not supported.


Updates in Abaqus XFEM (crack modeling) […]

The Dassault Systèmes SIMULIA portfolio releases new versions of its software products every year, and this year is no different. The first release of Abaqus 2017 is now available for download at the media download portal. In this blog post, I provide a brief highlight of the updates in Abaqus/CAE 2017. A more detailed explanation of any mentioned update, or answers to further questions, can be obtained by listening to the webinar recordings at the SIMULIA 3DExperience user community portal, leaving a comment on this post, or contacting us.

  • Consistency check for missing sections

Abaqus/CAE users would probably agree that forgetting to assign a section happens quite often, even though parts with defined section assignments are displayed in a separate color. In previous releases, this check was not included in data check runs (a job submitted with the datacheck option), so the error could not be detected unless a full run was executed. In the 2017 release, regions with missing sections are identified in a data check run, saving time by eliminating runs that end in fatal errors.


 

  • New set and surface queries in query toolset

Sets and surfaces can be created at both the part and assembly levels. In earlier releases, it was not possible to see the contents of a set or surface as text, though the contents could be visualized in the viewport. In the 2017 release, the query toolset includes set and surface definition options. For sets, information about geometry, nodes, and elements can be obtained with respect to label, type, connectivity, and association with a part or instance, whichever is applicable. For surfaces, the name, type, and association with instances, constraints, or interactions can be obtained.


 

  • Geometry face normal query

In the 2017 release, it is possible to visualize the normal of a face or surface by picking it in the viewport. For planar faces, the normal is displayed instantly. For curved faces, CAE prompts the user to pick a point location on the face using various options.


[…]

In a race, the quickest runner can never overtake the slowest, since the pursuer must first reach the point whence the pursued started, so that the slower must always hold a lead.

— Aristotle, Physics VI:9, 239b15

This paradox, as first developed by Zeno and later retold by Aristotle, shows us that a mathematical theory can be disproved by taking its hypothesis to an absurd conclusion. To look at it another way, consider this joke:

A mathematician and an engineer are trapped in a burning room.

The mathematician says “We’re doomed! First we have to cover half the distance between where we are and the door, then half the distance that remains, then half of that distance, and so on. The series is infinite. There’ll always be some finite distance between us and the door.”

The engineer starts to run and says “Well, I figure I can get close enough for all practical purposes.”

The principle here, as it relates to simulation such as FEA, is that every incremental step taken in the simulation process gets us closer to our ultimate goal of understanding the exact behavior of the model under a given set of circumstances. However, there is a point of diminishing returns, beyond which a physical prototype must be built. This evolution of simulating our designs has saved a lot of money for manufacturers who, in the past, would have had to build numerous iterative physical prototypes. This evolution of FEA reminds me of…

The uncanny valley is the idea that as a human representation (robot, wax figure, animation, 3D model, etc.) increases in human likeness, people feel more affinity toward the representation. That is, however, only until a certain point. Once this threshold is crossed, our affinity drops off to the point of revulsion, as in the case of zombies or “intermediate human-likeness” prosthetic hands. However, as the realism continues to increase, the affinity in turn starts to rise again.

Personally, I find this fascinating: a trend moving through time can abruptly change direction and then, for some strange reason, revert to its original direction. Why does this happen? There are myriad speculations as to why on the Wikipedia page, which I’ll encourage the reader to peruse at leisure.

But to tie this back to FEA, think of the beginning of the uncanny valley curve as the start of computer-aided design simulation. The horizontal axis is time; the vertical axis is accuracy. I posit that over time, as simulation software has improved, the accuracy of our simulations has also increased. As time has gone on, the ease of use has also improved, allowing non-doctorate holders to utilize simulation as part of their design process.

And this is where we see the uncanny valley: as good as the software is, there comes a point, if you need specialized, intricate, or non-standard analysis, where the accuracy of the software falters. This tells us that there will still be a need for those PhDs, and once they join the design effort and start using the software, we see the accuracy go up exponentially.

If you need help getting to the door, or navigating the valley, talk to us about our Simulation benchmark process. Leave a comment or click to contact us.

 

In the years to come, fuel efficiency and reduced emissions will be key factors in determining success within the transportation & mobility industry. Fuel economy is often directly associated with the overall weight of the vehicle. Composite materials have been widely used in the aerospace industry for many years to achieve the objectives of light weight and better performance at the same time.

The transportation & mobility industry has been following the same trends, and it is not uncommon to see composites applied in this industry sector nowadays; however, unlike in the aerospace industry, wide replacement of metals with composites is not feasible in the automotive industry. Hence, apart from material replacement, other novel methods of designing and manufacturing lightweight structures without compromising performance will find greater utilization in this segment. In this blog post, I will discuss the application of Tosca, a finite element based optimization technology.

Lightweight design optimization using the virtual product development approach is a two-step process: concept design followed by improved design.

Design concept: Product development costs are largely determined in the early concept phase. The automatic generation of optimized design proposals reduces the number of product development cycles and physical prototypes; quality is increased and development costs are significantly reduced. All you need is the definition of the maximum allowed design space; Tosca helps you find the lightest design that fits within it and considers all system requirements. The technology associated with the concept design phase is called topology optimization; it considers all design variables and functional constraints in the optimization cycle while chasing the minimum-weight objective function. The technique is iterative and often converges to an optimal design.

HOW IT WORKS

The user starts with an initial design by defining the design space, design responses, and objective function. The design space is the region from which material removal is allowed in incremental steps, and the objective function is often the overall weight of the component being optimized. With each incremental removal of material, the performance of the component changes. Hence each Tosca increment is followed by a finite element analysis that checks the current performance against the target performance. If the target performance criteria are satisfied, the design increment is accepted and Tosca proceeds to the next increment. This process of incremental material removal continues until the objective function is satisfied or no further design improvement is feasible. The image below depicts the complete CAD-to-CAD process flow in Tosca. The intermediate processes include Tosca pre-processing, co-simulation between Tosca and a finite element code, and Tosca post-processing.

Tosca workflow
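
To make the loop concrete, here is a conceptual Python sketch of the incremental-removal workflow described above. It is not the Tosca API; the three placeholder functions stand in for the real co-simulation between the optimizer and the FE code:

# Conceptual topology-optimization loop: remove a little material, re-run
# the FEA, keep the increment only if the target performance still holds.

def run_fea(design):
    """Placeholder: run the finite element model, return performance data."""
    raise NotImplementedError

def remove_lowest_utilized_material(design, fraction=0.02):
    """Placeholder: delete the least-utilized fraction of the design space."""
    raise NotImplementedError

def meets_target(performance, target):
    """Placeholder: compare predicted performance against the target."""
    raise NotImplementedError

def topology_optimize(design, target, max_increments=100):
    for _ in range(max_increments):
        candidate = remove_lowest_utilized_material(design)
        performance = run_fea(candidate)      # FEA follows every increment
        if not meets_target(performance, target):
            break                             # removal went too far; stop
        design = candidate                    # accept increment and continue
    return design                             # lightest acceptable design

The sketch also shows why the process terminates: either the increment budget runs out or the first rejected increment ends the search.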

During the material removal process, Tosca can be asked to perform the optimization in a way that yields a feasible solution not only from a design perspective but also from a manufacturing perspective. For example, Tosca may be asked to recommend only those design variations that can be manufactured using casting or stamping processes. This is done by defining one or more of the manufacturing constraints available in the Tosca constraints library.

manufacturing constraints

While topology optimization is applicable only to solid structures, that does not mean Tosca cannot optimize sheet metal parts. The sizing optimization module of Tosca allows users to define the thicknesses of sheet metal parts as design variables, each with a lower and an upper bound. […]

Predictive Engineering Analytics is a must in current product design and is required to integrate all the multidisciplinary inputs present in today’s complicated products. In the words of Siemens:

“Predictive Engineering Analytics (PEA) is the application of multi-discipline simulation and test, combined with intelligent reporting and data analytics, to develop digital twins that can predict the behavior of products across all performance attributes, throughout the product lifecycle.”

In the above quote, the concept of a digital twin is important; this is the goal of having a complete digital representation of a physical product throughout its design, manufacturing and in-service lifecycle. Such a digital twin can accurately predict all behaviors of the physical product.

There are five key ways that Simcenter(TM) helps achieve a digital twin.

  1. Simcenter(TM) has an integrated Engineering Desktop environment, which allows all pre- and post-processing operations to be carried out. This environment has best-in-class geometry editing tools, comprehensive meshing, and an ability to associate the analysis model to design data.
  2. The environment is completely extendable and can be scaled from simple to complex problems. The benefits include a common user interface and the capability to create automated routines for common simulation problems.
  3. Simcenter(TM) can be linked into other integrated products and engineering systems. This enables simulation data management and allows links to be established to test data, 1D simulations, and 3D CAD. Engineers now have confidence that the behaviors predicted by the digital twin correlate with real life.
  4. The solution produces a common environment across all engineering departments. This allows knowledge to be captured, automated, and then used by a broader team. Specific solutions and best practices become entrenched in the organization, allowing for more consistent design processes. Training requirements are also reduced.
  5. Simcenter(TM) leverages the extensive industry knowledge and capabilities of the Siemens PLM broader portfolio.

If we look at the specific functions that Simcenter(TM) can cover, here is a quote and graphic from a Siemens presentation:


Another unique feature of the Simcenter(TM) solution is its open platform capability. The solution can be used as the primary pre- and postprocessor for Siemens PLM Solvers, NX Nastran and LMS Samcef, or for popular third party solvers, such as Abaqus, ANSYS, LS-DYNA, and MSC Nastran. This is illustrated in another graphic from a Siemens presentation: […]

How many times has the first design iteration submitted to FEA modeling passed the design criteria?

The answer is close to zero, but even if it does happen by a stroke of fortune, the design is not the optimal design. Although the design requirements are met and validated by FEA, there is always scope for improvement, either in terms of cost or in terms of performance. In general, it is not unusual to need 15 to 20 iterations to reach the optimal design.

An analyst knows the pain of creating a detailed finite element simulation model. Most of the steps involved, such as geometry clean-up and meshing, are very time-consuming, and they are primarily driven by geometry. Let’s look at the workflow in more detail:

An analyst in the automotive industry often performs finite element modeling in HyperMesh, stress analysis in Abaqus, optimization in OptiStruct, and durability in fe-safe or nCode. An analyst in the aerospace industry often performs CAD composites work in CATIA, finite element modeling in Abaqus/CAE, stress analysis in Abaqus or Nastran, and durability in fe-safe. Analysts in other industries have their own suites of FEA tools to work with. The entire process requires data to flow from one simulation code to another; the output of one code serves as the input to the next. Quite often this work is done manually by the analyst.

This means that when the optimal design is obtained in 20 iterations, as mentioned above, the analyst has to perform geometry clean-up 20 times, create FE meshes manually 20 times, and transfer the simulation data from one piece of code to another 20 times. By the time these design iterations are over, the analyst’s face and computer look somewhat like this:

Let analysts remain analysts, and let the simulation robot do the rest!

The traditional job of a finite element analyst is to build robust, high-fidelity simulation models that give correct results under real-life load applications. The analyst is not an FE robot who can perform repetitive tasks with ease. In situations like the one mentioned above, it makes perfect sense to let the FE analyst create a robust FE model only once per FE code involved, and then introduce a simulation robot that can capture the hidden steps and workflow, create a script, and execute that script multiple times. This simulation robot is called Isight. […]
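
To give a flavor of what such a robot automates, here is a hedged Python sketch of one design-point evaluation: update a parameterized input deck, submit the job, and harvest the response. The file names, the <THICKNESS> placeholder, and the output parser are hypothetical; only the "abaqus job=... interactive" command form is standard:

# One automated design-point evaluation, of the kind Isight chains together.
import subprocess
from pathlib import Path

TEMPLATE = Path("bracket_template.inp").read_text()

def parse_max_stress(dat_path):
    """Placeholder: extract the response of interest from the solver output."""
    raise NotImplementedError

def run_design_point(thickness_mm):
    deck = TEMPLATE.replace("<THICKNESS>", "%.3f" % thickness_mm)
    Path("bracket.inp").write_text(deck)      # model update
    subprocess.run(["abaqus", "job=bracket", "interactive"], check=True)
    return parse_max_stress(Path("bracket.dat"))  # response extraction

# A DOE matrix or optimizer would supply these values.
for t in (1.0, 1.5, 2.0, 2.5):
    print(t, run_design_point(t))

Everything the analyst would otherwise repeat by hand, the model update, the job submission, and the data transfer, sits inside one scripted loop.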

© Tata Technologies 2009-2015. All rights reserved.