In an era of unprecedented globalization, product companies face market pressure from global competition and price deflation. Today they seek alternate sources of profitable revenue growth enabled by value-add service products. Developing a service-based revenue stream and then delivering product service that is both effective and profitable has its own challenges, however. Even mature service organizations are seeking new approaches to reach a significantly higher quality of service delivery.

Today in a typical product company, there is no single application to manage the data and the decision points required to deliver effective service. Multiple enterprise applications, including PLM, ERP, and often a combination of local databases, spreadsheets, and stand-alone IT systems, are involved in service management. This results in fragmented information and knowledge processes around service delivery.

A new approach centered on incorporating service lifecycle management (SLM) as an integral part of product lifecycle management (PLM) is required in order to achieve significant improvement in service readiness and delivery. First, this approach focuses on making complex products easier and less costly to maintain, allowing for more effective allocation of service resources. The second key component is managing the complexity of service information, which will reduce the cost and time to create and deliver critical service documentation and service records while improving the quality and efficacy of this information.

With SLM approached as an extended PLM process, design information can be used to bootstrap and enhance service planning, product changes and updates flow into service work instructions, and field experience provides up-to-date insight into product quality. The bulk of the information required for service, such as illustrations, schematics, and work instructions, already exists within the engineering organization and can be repurposed with relatively little effort. 3D CAD models and bills of materials can be used to create everything from exploded wireframe views to photorealistic renderings, as well as remove-and-replace animations that aid service execution. Manufacturability and ergonomic simulations can be used to improve the safety and efficiency of repair procedures.

The expanded PLM system needs to act as a centralized repository for the service bill of materials (sBOM) alongside the engineering and manufacturing BOMs, so that service items, which are mostly design and manufacturing items repurposed for service, can be synchronized to reflect the most up-to-date information. This synchronization is possible when SLM is part of PLM and shares the same configuration and change management processes.
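To make the synchronization point concrete, here is a toy sketch (hypothetical data, not a real PLM schema) of why referencing, rather than copying, engineering items keeps the sBOM current:

    # Toy model: sBOM lines reference engineering items by ID, so a revision
    # bump made under shared change management is visible on the service side
    # immediately - nothing is copied, so nothing goes stale.
    ebom = {"PUMP-100": {"rev": "B", "description": "Coolant pump"}}
    sbom = [{"item": "PUMP-100", "kit": "Seal service kit"}]

    ebom["PUMP-100"]["rev"] = "C"  # engineering change order bumps the revision

    for line in sbom:
        print(line["kit"], "references", line["item"], "rev", ebom[line["item"]]["rev"])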

In this way, enterprise PLM systems become the digital backbone of the entire product life cycle – including SLM – and SLM becomes a dynamic process connected with PLM that continues throughout the useful life of the product or asset. This reduces process fragmentation and provides rich end-to-end context for better and more profitable service.

The combined PLM and SLM approach, along with new service models based on the latest technologies (such as the Internet of Things), enables brand owners to deliver higher quality service at lower cost, resulting in higher profit margins, enhanced brand image, and greater customer loyalty. Product or asset owners who are the end customers also benefit from increased utilization and productivity due to faster and more reliable service.

What do you think? Is your organization connected?

Today’s topic focuses on the licensing side of CATIA – namely CAT3DX and the thinking behind what it is here for.

Several years ago, Dassault changed the way they packaged CATIA V5 by introducing PLM Express as a way to buy it; my colleague Jason Johnson explained this in a previous post. As he mentioned, this was referred to as CATIA TEAM PLM and was designed to connect current CATIA V5 users to their new PLM offering, ENOVIA SmarTeam. He also wrote briefly about the configurations and trigrams that make up the software. The easiest way to think about trigrams is that a group of trigrams makes up a configuration, and trigrams by themselves give you access to particular workbenches – or, in some cases, only add toolbars to existing workbenches.

Why does this matter? Because there is a new sheriff in town called 3DEXPERIENCE. Much more than a PLM system, the 3DEXPERIENCE platform suite of tools assists users in managing their daily work, projects, processes, BOMs, documents, CAD data, etc. While an old CAC (CAT) license – the base configuration for PLM Express – would give you access to SmarTeam by bundling in the TDM and TEM trigrams, the new CAT3DX gives you all of that, plus access to the ENOVIA 3DEXPERIENCE platform, by including the PCS and CNV trigrams as well. These are the minimum trigrams needed to connect to the platform (the price of admission).

The Dassault idea is still the same – help CATIA V5 users move away from file- and directory-based storage (which has always presented its own challenges) and help companies regain control of their data via the new platform. The only caveat is that you must install ENOVIA to manage your data, which is not as simple as throwing in a set of discs like SmarTeam was. ENOVIA requires setting up a database using SQL Server or Oracle, then configuring the various pieces (web server, authentication, Java virtual machine, etc.). Once this has been configured, the base PCS/CNV combination gives you the ability to vault your data, set up workspaces for where and how it will be stored, and perform some level of change management (setting up Change Requests and Routes that move your data through its life cycle to release).

Creation Menu

 

The ENOVIA applications that come with the PCS/CNV combination are Classify & Reuse, Collaboration & Approvals, Collaborative Lifecycle, Design IP Classification, Exchanges Management, My Collections, My Issues, My Route Tasks, and X-CAD Design. These are enough to help your team begin to get to a single source of truth – meaning never having to guess what state the latest data is in.

ENOVIA Apps

You also have access to applications for business intelligence. This includes Dashboards – configurable views of your data, not unlike the old iGoogle portal that let you customize your view of news and other content. In 2012, Dassault acquired Netvibes, whose technology underpins these dashboards.

netvibes

Information Intelligence

[…]

Autodesk Vault uses the concept of a “Local Workspace” whenever files are opened or checked out. Essentially, whenever a Vault file is accessed, a copy is cached in the workspace on the user’s local workstation. From a user’s perspective, the workspace can be ignored for most regular work. There are several benefits to a local workspace.

  1. Performance improvement over network share – One of the problems of working without a PDM system is that files are opened directly across the network. Files being accessed and edited live on a network share and stay there while being worked on. In environments with multiple users working with large datasets, this can become a disaster. When files are checked out from Vault, they are cached locally, and the workstation’s drives can respond to changes much more quickly than a network server.
  2. Offline workflows – The local workspace also allows users to retrieve data to work on while disconnected from their corporate network.  The local workspace actually acts much like a briefcase:  The user simply checks out files, disconnects from the network and works on them, and checks them back in when they return to the network and are logged back into Vault.
  3. Better distributed workforce management – For companies with distributed workforces, the local workspace can also be a big benefit. Combining the performance gain with offline workflows makes working with a distributed workforce practical. All that is required is a remote VPN connection, and then files can be checked in and out of Vault. The VPN doesn’t have to be permanently connected; when it is disconnected, work proceeds just like an offline workflow. Since files checked out from Vault reside locally, distributed users still get good performance while editing and saving their work.
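The mechanics behind all three benefits are the same check-out/check-in caching idea. Here is a minimal sketch of that logic in Python (a toy model only; this is not the Autodesk Vault API, and the paths are hypothetical):

    import shutil
    from pathlib import Path

    SERVER = Path("//fileserver/vault")          # hypothetical network share
    WORKSPACE = Path.home() / "vault_workspace"  # the local workspace cache
    checked_out = {}                             # filename -> user holding it

    def checkout(name: str, user: str) -> Path:
        """Copy the file to the local cache and record the lock holder."""
        if name in checked_out:
            raise RuntimeError(f"{name} already checked out by {checked_out[name]}")
        WORKSPACE.mkdir(exist_ok=True)
        local = WORKSPACE / name
        shutil.copy2(SERVER / name, local)  # local edits are fast; offline work is possible
        checked_out[name] = user
        return local

    def checkin(name: str, user: str) -> None:
        """Copy the edited file back to the server and release the lock."""
        shutil.copy2(WORKSPACE / name, SERVER / name)
        del checked_out[name]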

 

This is a second look at the hidden intelligence of CATIA V5. Our topic today focuses on the creation and use of design tables. As I discussed in my last blog post, parameters and formulas can be used to drive your design from the specification tree based on your design intent. We will continue using the rectangular tubing part and build several variations of that tubing, driven from a spreadsheet.

Design Table Icon

Most of the work has already been done; while it is not necessary to have pre-defined parameters and formulas, having them makes the process faster. We will begin by again looking at the Knowledge toolbar, this time focusing on the Design Table icon.

When the command is selected, a dialog appears asking for the name of the design table and giving you a choice: use a pre-existing file or create one from the current parameter values. The difference is whether you already have a spreadsheet filled out with the tabulated values of what changes in each iteration of the design.

Design Table Dialog

 

In our case, to show the functionality, we will choose the create with current parameter values option. Once that is decided, you choose which parameters you want to be driven by the spreadsheet. In our case, we had some already created, so we changed the filter to User parameters, chose the values that were NOT driven by formulas (leaving out the formula-driven INSIDE and OUTSIDE RADII), and moved them to the inserted side by highlighting them and clicking the arrow.

Parameters to Insert

At this point, we have defined that we want a spreadsheet with columns for Height, Width, and Wall Thickness based on the current values in the model. When we click OK on the dialog, it asks where we want to save the spreadsheet. I suggest a place where anyone who uses the model has at least read access (i.e., a network drive). Note that I can also change the file type to .txt if I do not have access to Excel® or other software that can edit .xls files.

Read Access Directory

 

Once this has been defined, your design table is created, linked to your 3D model, and ready to be edited to include your alternate sizes. This is confirmed by the next dialog. To add the other sizes, simply click the Edit table… button; your editor (Excel or Notepad) should launch, and you can fill in rows with your values.

Linked and ready to edit
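For illustration, here is what those rows might look like in a tab-delimited .txt design table (the values are hypothetical; CATIA expects each column header to be the parameter name with its unit, and each row to be one configuration):

    Height (mm)    Width (mm)    Wall Thickness (mm)
    50             25            3
    75             50            5
    100            50            6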

Once you have edited and saved the values, you can close that software and CATIA will update based on your values.

Excel Modifications

 

CATIA Updated

Now you would just pick the value set you want and click OK for the change to appear on the screen.

File Updated

At any time, you can make changes by finding the Design Table under the Relations section of the specification tree and double-clicking on it.

Design Table under Relations

As you can see, it’s pretty easy to create a design table and drive your parametric file with multiple values. The world of CATIA V5 is all about re-use of data and capturing the business intelligence that we already know exists in all companies. How can we help you? Tata Technologies has helped many companies time and again.

Stay tuned for Part 3!

This post was originally created on December 8, 2016.

With all the buzz about Additive Manufacturing, or 3D Printing, in the manufacturing world today, there is a lot of mystery and confusion surrounding common practices and techniques. This week’s blog post will address a common type of 3D printing known as Fused Deposition Modeling (FDM).

But first, what is Additive Manufacturing?

Additive manufacturing is the process of creating a part by laying down a series of successive cross-sections (2D “sliced” sections of a part). It came into the manufacturing world in the early 1980s and was adopted more widely later in the decade. Another common term used to describe additive manufacturing is 3D Printing – a term which originally referred to a specific process, but is now used to describe all similar technologies.

Now that we’ve covered the basics of 3D Printing, what is Fused Deposition Modeling?

It is actually part of a broader category commonly referred to as Filament Extrusion Techniques. Filament extrusion techniques all utilize a thin filament or wire of material. The material, typically a thermoplastic polymer, is forced through a heating element and extruded out in a 2D cross-section onto a platform. The platform is lowered and the process is repeated until the part is complete. In most commercial machines, and higher-end consumer grade machines, the build area is typically kept at an elevated temperature to prevent part defects (more on this later). The most common form, and the first technology of this type to be developed, is FDM.
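As a minimal illustration of the “successive cross-sections” idea (toy code, not a real slicer), the following sketch finds which triangles of a mesh each horizontal layer plane intersects:

    import numpy as np

    # Hypothetical mesh data: each row is the (zmin, zmax) extent of one triangle
    tri_z = np.array([[0.0, 1.2],
                      [0.8, 2.5],
                      [2.0, 3.0]])

    layer_height = 0.5
    z_levels = np.arange(0.0, tri_z.max() + layer_height, layer_height)

    # A layer plane at height z "slices" every triangle whose extent spans z
    for z in z_levels:
        crossing = np.where((tri_z[:, 0] <= z) & (tri_z[:, 1] >= z))[0]
        print(f"layer z={z:.2f}: intersects triangles {crossing.tolist()}")

A real slicer goes on to compute the intersection contours and convert them into extruder tool paths, but this layer-by-layer loop is the heart of the process.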

The Fused Deposition Modeling Technique was developed by S. Scott Crump, co-founder of Stratasys, Ltd. in the late 1980s. The technology was then patented in 1989. The patent for FDM expired in the early 2000s. This helped to give rise to the Maker movement, by allowing other companies to commercialize the technology.

It should also be noted that Fused Deposition Modeling is also known as Fused Filament Fabrication, or FFF. This term was coined by the Reprap community, because Stratasys has a trademark on Fused Deposition Modeling.

What Are the Advantages of this Process? […]

I mentioned the process automation concept of Isight in a previous simulation automation blog. Isight is a simulation automation and parametric optimization tool used to create workflows that automate the repetitive process of model update and job submission, with certain objectives associated with it. The objective could be achieving an optimal design through any of the techniques available in Isight: design of experiments, optimization, Monte Carlo simulation, or Six Sigma. In this blog post, I will discuss the various value-added algorithms in the DOE technique; I will cover the other techniques in future blogs.

Why design of experiments?

Real-life engineering models involve multiple design variables and multiple responses. There are two ways to evaluate the effect of a change in a design variable on a response: the vary-one-at-a-time (VOAT) approach or the design of experiments (DOE) approach. The VOAT approach is not viable because:

  • This approach ignores interactions among design variables, averaged and non-linear effects.
  • In models associated with large FE entities, each iteration is very expensive. VOAT does not offer the option of creating high fidelity models with a manageable number of iterations.

With the DOE approach, the user can study the design space efficiently, manage a multi-dimensional design space, and select design points intelligently instead of guessing manually. The objective of any DOE technique is to generate an experimental matrix using formal, proven methods. The matrix explores the design space, and each technique creates its design matrix differently. There are multiple techniques, which will be discussed shortly; they fall into two broad configurations:

  • Configuration 1: User defines the number of levels and their values for each design variable. The chosen technique and the number of variables determine the number of experiments.
  • Configuration 2: User defines the number of experiments and the range of each design variable.

Box-Behnken Technique

This is a three-level factorial design consisting of orthogonal blocks that excludes extreme points. Box-Behnken designs are typically used to estimate the coefficients of a second-degree polynomial. The designs either meet, or approximately meet, the criterion of rotatability. Since Box-Behnken designs do not include any extreme (corner) points, they are particularly useful in cases where the corner points are either numerically unstable or infeasible. Box-Behnken designs are available only for three to twenty-one factors.
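To see what a Box-Behnken matrix actually looks like outside Isight, here is a short sketch using the open-source pyDOE2 package (an assumption: it is installed separately and is unrelated to Isight):

    import numpy as np
    from pyDOE2 import bbdesign

    coded = bbdesign(3)   # 3-factor Box-Behnken: edge midpoints plus center runs
    print(coded)          # rows of -1/0/+1 coded levels; note: no corner points

    # Map coded levels onto real variable ranges (illustrative bounds)
    lo = np.array([2.0, 10.0, 100.0])
    hi = np.array([6.0, 30.0, 300.0])
    real = lo + (coded + 1.0) / 2.0 * (hi - lo)
    print(real)

Every row is one experiment; because no row sits at a corner of the design space, the numerically unstable or infeasible extremes mentioned above are never evaluated.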

Central Composite Design Technique […]

Watch and learn as i GET IT subject matter expert Eric Bansen walks through the Context Toolbar feature in CATIA V6 2016X 3DEXPERIENCE. The Context Toolbar feature allows you to create and manipulate geometry within your part model. This session is great for those who are currently learning or want to know more about CATIA V6 2016X 3DEXPERIENCE. For further interest, there is the i GET IT course “3D Experience CATIA V6 2016X New Essentials“.

Click to Join

If you are in the business of designing and engineering products, then you have PLM. This is a statement of fact. The question then becomes: what is the technology underpinning the PLM process used to control your designs?

Because of the way that technology changes and matures, most organizations have a collection of software and processes that support their PLM processes. This can be called the Point Solution approach. Consider a hypothetical setup below:

The advantage of this approach is that point solutions can be individually optimized for a given process – so, in the example above, the change management system can be set up to exactly mirror the internal engineering change process.

However, this landscape also has numerous disadvantages:

  1. Data often has to be transferred between different solutions (e.g., what is the precise CAD model tied to a specific engineering change?). These integrations are difficult to set up and maintain – sometimes to the point of being manual tasks.
  2. The organization has to deal with multiple vendors.
  3. Multiple PLM systems working together require significant internal support resource from an IT department.
  4. Training and onboarding of new staff is complicated.

The alternative to this approach is a PLM Platform. Here, one technology solution includes all necessary PLM functionalities. The scenario is illustrated below:

It is clear that the PLM Platform does away with many of the disadvantages of the Point Solution; there is only one vendor to deal with, integrations are seamless, training is simplified, and support should be easier.

However, the PLM Platform may not provide the best solution for a given function when compared to the corresponding point solution. For example, a dedicated project management application may do a better job at Program Management than the functionality in the PLM Platform; this may require organizational compromise. You are also, to some extent, betting on a single technology vendor and hoping that they remain an industry leader.

Some of the major PLM solution vendors have placed such bets on the platform strategy. For example, Siemens PLM has positioned Teamcenter as a complete platform solution covering all aspects of the PLM process (refer to my earlier blog post, What is Teamcenter? or, Teamcenter Explained). All of the PLM processes that organizations need can be supported by Teamcenter.

Dassault Systèmes have pursued a similar approach with the launch of their 3DEXPERIENCE platform, which also contains all of the functions required for PLM. In addition, both are actively integrating additional functionality with every new release.

So what is your strategy – Point or Platform? This question deserves serious thought when evaluating the PLM processes in your organization.

For many years, finite element modeling has been the job of a specialist; the tools used to perform even simple finite element analysis have been complex enough to require a subject matter expert. This is primarily due to the complex, difficult-to-understand graphical user interfaces of these products. The job is made more difficult still by the advanced engineering knowledge required of the analyst.

Can a mechanical designer who uses CAD tools to create engineering drawings be trained to perform engineering simulations?

In today’s product landscape, the answer is “yes.”

A CAD designer using CATIA can create and execute simple finite element models within the CATIA environment by using CATIA workbenches that have been developed for simulations. This makes it intuitive and easier for designers to ensure that their parts meet their design requirements.


How the simulation methodology is simplified by designer-level tools

  • No need for an expert-level analyst tool to perform simple finite element simulation.
  • No need for manual data transfer between design and analysis departments.
  • No need for geometry clean-up tools to fix data translation errors.

There are obvious benefits to adopting this simplified approach that integrates the design and analysis environments. The designer can predict design problems early in the design process and subsequently check various design alternatives in less time. This is primarily due to the tight integration of designer-level tools with knowledge-based engineering, which allows the designer to deliver a better product in less time.

Part Level Simulation

From a geometrical perspective, the simulation model can be generated at the part level to begin with. The native integration within CATIA allows users to perform stress, displacement, and vibration analysis at any time in the design process, allowing more accurate sizing of parts and fewer design iterations. Individual parts consisting of solid, surface, and wireframe geometries can be analyzed under a variety of loading conditions. The analysis specifications, such as loads and restraints, are associative with the design, allowing users to perform analyses quickly and easily. These specifications are then automatically incorporated into the underlying finite element model, meaning that users do not have to work directly with the finite element model. “Virtual parts” allow items like forces, moments, and restraints to be modeled easily without a detailed geometric representation.
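As a back-of-envelope analogue of the kind of sizing check such a designer-level simulation automates (illustrative numbers, not CATIA code): compare the peak bending stress in a simple cantilever against the material’s yield strength.

    # Cantilever beam with a tip load: the classic hand calculation that a
    # designer-level stress analysis generalizes to arbitrary geometry.
    F = 100.0            # tip load, N
    L = 0.3              # beam length, m
    b, h = 0.02, 0.01    # rectangular cross-section width and height, m

    I = b * h**3 / 12.0                  # second moment of area, m^4
    sigma_max = F * L * (h / 2.0) / I    # peak bending stress at the fixed end, Pa
    yield_strength = 250e6               # mild steel, Pa (assumed)

    print(f"max stress: {sigma_max / 1e6:.0f} MPa")            # ~90 MPa
    print(f"safety factor: {yield_strength / sigma_max:.2f}")  # ~2.78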

Standard reports can be automatically generated in HTML format, providing clear and detailed information about the results of the analysis, including images associated with the computations. These reports can be used to document the analyses that have been performed and to communicate the results of the analysis to other stakeholders in the organization. CATIA V5 Analysis users benefit naturally from the overall PLM solution provided by Dassault Systèmes, including ENOVIA V5 for data and product lifecycle management. CATIA V5 Analysis users can store, manage, and version all the data associated with their product’s simulation and share the information within the extended enterprise. This unique capability allows collaboration and provides access to advanced PLM practices such as concurrent engineering and change management.


Assembly level simulation

If the concept of virtual parts no longer holds and the complexity of parts interacting with each other makes assembly-level simulation mandatory, it is possible to create analysis models for assemblies as well. The analysis of assemblies, including an accurate representation of the way the parts interact and are connected, allows for more realistic and accurate simulation. The designer does not have to make simplifying assumptions about the loading and restraints acting on an individual part. Instead, the part can be analyzed within the environment in which it operates, with the loading automatically determined from the way the part is connected to and interacts with surrounding parts.

The various types of connections that can be modeled include bolted connections, welded connections, pressure fitting connections, and many more. To make the job even easier for the designer, these connections can be defined using assembly-level constraints that already exist in the CATProduct model. When the design changes, the associated assembly constraints and the corresponding FEA connections are updated, producing an updated FEA model that is ready for analysis.

Concurrent engineering made easier

The “assembly of analysis” capability enables concurrent engineering. For example, the various parts in an assembly can be modeled and meshed separately by different users. They can either use the CATIA V5 meshing tools or import orphan meshes (meshes that don’t have any geometry associated with them) developed outside of CATIA Analysis using a variety of different modeling tools. The user responsible for analyzing the assembly can consolidate the different meshes, connect the parts, apply the loading specifications, and run the simulation. This can significantly reduce the turnaround time when analyzing large assemblies, particularly since some of the parts may have already been analyzed and therefore, the analysis models would already be available.


Extended solver capabilities

The basic FEA solver present in the CATIA designer workbench is called the “Elfini” solver and can model only simpler physical problems: linear materials, small deformations, small rotations, and bonded contacts. Real-life problems can be much more complex and may necessitate an advanced solver. To address such scenarios, it is possible to include the well-known nonlinear solver Abaqus in the CATIA designer environment; it can model the effects of geometric nonlinearity, such as large displacements, and allows nonlinear materials to be included, such as yielding metals and nonlinear elastic materials like rubber. It also offers more advanced contact capabilities, including the ability to model large relative sliding of surfaces in contact.

The Abaqus capability enables the effect of multiple steps to be analyzed, where the loading, restraints, contact conditions, etc., vary from one step to the next. This powerful technique allows complex loading sequences to be modeled. For example, a pressure vessel might be subjected to an initial bolt tightening step, followed by internal pressurization, and conclude with thermal loading.


 

My last post outlined how to derive more value from PLM data through reports. The complexity of data in the engineering environment is skyrocketing, and Teamcenter as a PLM system provides advanced reporting capabilities for enterprise data, including data managed in external systems like MRP and ERP.

The Teamcenter Report Builder application provides basic reporting capabilities for data managed inside Teamcenter. It supports two kinds of reports:

  1. Summary Reports
  • Reports that summarize similar information – for example, reports that show all the employees, all the items belonging to a user, or the release status of items
  • Context-independent reports – no object selection required
  • Generated from Teamcenter saved queries
  2. Item Reports
  • Reports that can be run on a particular object – for example, BOM or workflow information for a given object
  • Executed in the context of one or more objects

Behind the scenes, Report Builder performs a query-based data dump from Teamcenter and supports output to common formats like Excel, XML, text, and HTML. It’s easy to build these simple reports from Teamcenter saved queries, and they can be run from both the rich client and the Active Workspace client. Excel can be further leveraged for complex processing, charting, and aggregation of the output.
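As a sketch of that last point (assuming a report has been exported to a CSV file; the file and column names here are hypothetical), a few lines of scripting can aggregate the raw dump:

    import pandas as pd

    # Hypothetical export from Report Builder with columns Owner, Status, ItemID
    report = pd.read_csv("item_report.csv")

    # Count items per owner and release status - the kind of aggregation the
    # post suggests layering on top of the raw report output.
    summary = report.groupby(["Owner", "Status"])["ItemID"].count()
    print(summary)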

The Teamcenter Reporting & Analytics module provides additional, more advanced reporting capabilities. It can summarize information and present data from many sources in a single report using an easy-to-build, configurable, drag-and-drop layout.

It can leverage standard formatting elements like headers/footers, dates, page numbers, report names, filters, tables, and charts. Reports can be run from either Active Workspace or the Teamcenter Reporting & Analytics client. Its business intelligence is designed for Teamcenter and understands the relationships and associations of PLM information. It comes with over 100 out-of-the-box reports in areas like Change Management, BOM reports, substance compliance, workflow, administrator reports, Verification Management, PMM, Schedule Manager, and Requirements Manager. It supports powerful, fast BOM reporting; project planning, status reporting, and dashboards; and process and change reporting.

It has direct access to Teamcenter data through APIs and has connectors to standard enterprise applications. It can also enforce data security based on the Teamcenter access model.  Additional capabilities include:

  • Customized Analytics
    • Organization-specific process status metrics and KPIs
    • Multi-level root-cause analysis
    • Mean time between failure / to failure (MTBF, MTTF) analysis
    • Historical Performance Analysis
  • Reporting Control
    • Save Snapshots of pre-defined reports
    • Group/Role based Access to report data
    • User Controlled Conditional Formatting
  • Resource Management
    • Automated Report Scheduler and Delivery
    • Submit Analysis to queue for load management
    • Caching Techniques to reuse data cubes

Teamcenter Reporting & Analytics benefits include:

  • Analytics, Dashboards and Traditional Reporting – understand your data to improve your products and processes
  • Time to Value – start with pre-configured reports and enable custom reports for your business in a couple of weeks, not months or years
  • Designed for Teamcenter – enable your entire enterprise to easily understand the information they require to make better decisions
  • Self-Service Analytics – enable data discovery through self-service analytics designed for Teamcenter and optimized to your needs
