Autodesk Vault uses the concept of a “Local Workspace” whenever files are opened or checked out. Essentially, whenever a Vault file is accessed, a copy is cached in the workspace on the user’s local workstation. From a user perspective, the workspace can be ignored for most day-to-day work. A local workspace offers several benefits.

  1. Performance improvement over a network share – One of the problems without a PDM system is that files are opened directly across the network. Files being accessed and edited sit on a network share and stay there while being worked on. In environments with multiple users working on large datasets, this can become a serious bottleneck. When files are checked out from Vault, they are cached locally, and the workstation’s drives can respond to changes much more quickly than a network server.
  2. Offline workflows – The local workspace also allows users to retrieve data to work on while disconnected from their corporate network.  The local workspace actually acts much like a briefcase:  The user simply checks out files, disconnects from the network and works on them, and checks them back in when they return to the network and are logged back into Vault.
  3. Better distributed workforce management – For companies with distributed workforces, the local workspace can also be a big benefit. Combining the performance improvement with offline workflows makes working with a distributed workforce practical. All that is required is a remote VPN connection, and files can then be checked in and out of Vault. The VPN doesn’t have to be permanently connected; when it is disconnected, work continues just like an offline workflow. Since files checked out from Vault reside locally, distributed users still get good performance while editing and saving their work.

 

This is a second look at the hidden intelligence of CATIA V5. Our topic today will focus on the creation and use of design tables. As I talked about in my last blog post, parameters and formulas can be used to drive your design from the specification tree based on your design intent. We will continue on using the rectangular tubing part and build several variations of that tubing that can be driven from a spreadsheet.

Design Table Icon

Most of the work has already been done; while it is not necessary to have pre-defined parameters and formulas in place, having them makes the process faster. We will begin by again looking at the Knowledge toolbar, this time focusing on the Design Table icon.

When the command is selected, a dialog appears asking for the name of the design table and whether you want to use a pre-existing file or create one from the current parameter values. The difference is whether you already have a spreadsheet filled out with the tabulated values of what changes in each iteration of the design.

Design Table Dialog

 

In our case, to show the functionality, we will choose the option to create the table from current parameter values. Once that is decided, you choose which parameters you want to be driven by the spreadsheet. In our case, we had some already created, so we changed the filter to User parameters, chose the values that were NOT driven by formulas (the INSIDE and OUTSIDE RADII), and moved them to the inserted side by highlighting them and clicking the arrow.

Parameters to Insert

At this point, we have defined that we want a spreadsheet with columns for Height, Width, and Wall Thickness based on the current values in the model as it stands at this moment. When we click OK on the dialog, it will ask us where we want to save the spreadsheet. I suggest you choose a place where anyone who uses the model has at least read access (i.e. a network drive). Note that you can also change the file type to .txt if you do not have access to Excel® or any other software that can edit .xls files.

Read Access Directory

 

Once this has been defined, your design table is created, linked to your 3D model, and ready to be edited to include your alternate sizes. This is confirmed by the next dialog. To add the other sizes, click the Edit table… button; your editor (Excel or Notepad) should launch, and you can fill in rows with your values.
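If you chose the .txt format, the extra rows can even be filled in with a short script rather than in Notepad. The sketch below assumes the tab-delimited layout CATIA writes for text design tables – one header row of parameter names, then one row per configuration; the parameter names, units, and values here are illustrative only and must match the header CATIA actually generated for your model.

```python
# Sketch: filling in the rows of a .txt design table with a script
# instead of Notepad. This assumes the tab-delimited layout CATIA
# writes for text design tables: one header row of parameter names,
# then one row per configuration. The parameter names, units, and
# values below are illustrative only.

header = ["Height (mm)", "Width (mm)", "Wall Thickness (mm)"]
configurations = [
    [50, 25, 3],   # the current values captured at creation time
    [75, 50, 5],   # additional sizes added by hand
    [100, 50, 6],
]

with open("rect_tubing_table.txt", "w") as f:
    f.write("\t".join(header) + "\n")
    for row in configurations:
        f.write("\t".join(str(v) for v in row) + "\n")
```

When the file is saved and the table is synchronized, each added row should appear as a selectable configuration, just as if it had been typed in by hand.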

Linked and ready to edit

Once you have edited and saved the values, you can close that software and CATIA will update based on your values.

Excel Modifications

 

CATIA Updated

Now you would just pick the value set you want and click OK for the change to appear on the screen.

File Updated

At any time, you can go back and make changes by finding the Design Table under the Relations section of the specification tree and double-clicking on it.

Design Table under Relations

As you can see, it’s pretty easy to create a design table and drive your parametric file with multiple values. The world of CATIA V5 is all about re-use of data and capturing the business intelligence that already exists in every company. How can we help you? Tata Technologies has helped many companies time and again.

Stay tuned for Part 3!


This post was originally created on December 8, 2016.

With all the buzz about Additive Manufacturing, or 3D Printing, in the manufacturing world today, there is a lot of mystery and confusion surrounding common practices and techniques. This week’s blog post will address a common type of 3D printing known as Fused Deposition Modeling (FDM).

But first, What is Additive Manufacturing?

Additive manufacturing is the process of creating a part by laying down a series of successive cross-sections (2D “sliced” sections of a part). It came into the manufacturing world in the early 1980s, about 35 years ago, and was adopted more widely later in the decade. Another common term used to describe additive manufacturing is 3D Printing – a term which originally referred to a specific process, but is now used to describe all similar technologies.

Now that we’ve covered the basics of 3D Printing, What is Fused Deposition Modeling?

It is actually part of a broader category commonly referred to as Filament Extrusion Techniques. Filament extrusion techniques all utilize a thin filament or wire of material. The material, typically a thermoplastic polymer, is forced through a heating element and extruded as a 2D cross-section onto a platform. The platform is lowered and the process is repeated until the part is complete. In most commercial machines, and in higher-end consumer-grade machines, the build area is typically kept at an elevated temperature to prevent part defects (more on this later). The most common form, and the first technology of this type to be developed, is FDM.
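The layer-by-layer idea can be illustrated with a toy calculation: given a part height and a layer thickness, the build is discretized into a fixed number of cross-sections. The numbers below are invented for illustration – a real slicer also generates the 2D toolpath for each layer.

```python
# Toy illustration of "successive cross-sections": discretizing a
# part of a given height into layers. Real slicers also generate a
# 2D toolpath for each layer; here we only compute the Z heights.

def layer_heights(part_height_mm, layer_thickness_mm):
    """Return the Z value at the top of each deposited layer."""
    n_layers = int(round(part_height_mm / layer_thickness_mm))
    return [(i + 1) * layer_thickness_mm for i in range(n_layers)]

layers = layer_heights(10.0, 0.25)  # a 10 mm tall part, 0.25 mm layers
print(len(layers))                  # 40 layers
print(layers[0], layers[-1])        # 0.25 10.0
```

Halving the layer thickness doubles the layer count (and roughly the build time), which is the basic quality-versus-speed trade-off in FDM.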

The Fused Deposition Modeling Technique was developed by S. Scott Crump, co-founder of Stratasys, Ltd. in the late 1980s. The technology was then patented in 1989. The patent for FDM expired in the early 2000s. This helped to give rise to the Maker movement, by allowing other companies to commercialize the technology.

It should also be noted that Fused Deposition Modeling is also known as Fused Filament Fabrication, or FFF. This term was coined by the RepRap community because Stratasys holds a trademark on Fused Deposition Modeling.

What Are the Advantages of this Process? […]

I mentioned the process automation concept of Isight in a previous simulation automation blog. Isight is a simulation automation and parametric optimization tool used to create workflows that automate the repetitive process of model update and job submission, with certain objectives attached. The objective could be achieving an optimal design through any of the techniques available in Isight: design of experiments, optimization, Monte Carlo simulation, or Six Sigma. In this blog post I will discuss various value-added algorithms in the DOE technique; I will cover the other techniques in future blogs.

Why design of experiments?

Real-life engineering models involve multiple design variables and multiple responses. There are two ways to evaluate the effect of a change in a design variable on a response: the vary-one-at-a-time (VOAT) approach or the design of experiments (DOE) approach. The VOAT approach is not viable because:

  • It ignores interactions among design variables, as well as averaged and non-linear effects.
  • In models with large numbers of FE entities, each iteration is very expensive. VOAT does not offer a way to build high-fidelity models with a manageable number of iterations.

With the DOE approach, the user can study the design space efficiently, manage a multi-dimensional design space, and select design points intelligently rather than guessing manually. The objective of any DOE technique is to generate an experimental matrix using formal, proven methods. The matrix explores the design space, and each technique creates its design matrix differently. There are multiple techniques, discussed shortly, which are classified into two broad configurations:

  • Configuration 1: The user defines the number of levels and their values for each design variable. The chosen technique and the number of variables determine the number of experiments.
  • Configuration 2: The user defines the number of experiments and the range of each design variable.
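Configuration 1 can be sketched with the simplest possible technique, a full-factorial design, where the experiment count follows directly from the level counts. The variable names and levels below are invented for illustration.

```python
# Sketch of "Configuration 1" with the simplest technique, a full
# factorial: the user fixes the levels for each design variable,
# and the number of experiments follows -- the product of the
# level counts. Variable names and levels are invented.
from itertools import product

levels = {
    "thickness": [1.0, 2.0, 3.0],  # three levels
    "width":     [10.0, 20.0],     # two levels
    "height":    [5.0, 7.5],       # two levels
}

design_matrix = [dict(zip(levels, combo)) for combo in product(*levels.values())]
print(len(design_matrix))  # 3 * 2 * 2 = 12 experiments
```

Each row of `design_matrix` is one experiment to submit; the DOE techniques below differ only in how they choose these rows more economically than the full product.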

Box-Behnken Technique

This is a three-level factorial design consisting of orthogonal blocks that excludes extreme points. Box-Behnken designs are typically used to estimate the coefficients of a second-degree polynomial. The designs either meet, or approximately meet, the criterion of rotatability. Since Box-Behnken designs do not include any extreme (corner) points, they are particularly useful in cases where the corner points are numerically unstable or infeasible. Box-Behnken designs are available only for three to twenty-one factors.
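The construction can be sketched directly: for every pair of factors, take the four ±1 combinations of that pair with all remaining factors held at their mid level, then append a few center runs. This is a minimal illustration in coded units, not Isight’s implementation.

```python
# Minimal Box-Behnken matrix in coded units (-1, 0, +1): for each
# pair of factors, the four corner combinations of that pair with
# every other factor at its mid level, plus a few center runs.
from itertools import combinations, product

def box_behnken(n_factors, n_center=3):
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a, b in product((-1, 1), repeat=2):
            run = [0] * n_factors
            run[i], run[j] = a, b
            runs.append(run)
    runs.extend([[0] * n_factors for _ in range(n_center)])
    return runs

matrix = box_behnken(3)
print(len(matrix))  # 4 runs per factor pair (3 pairs) + 3 centers = 15

# No run is an extreme (corner) point: at most two factors sit at +/-1.
assert all(sum(abs(x) for x in run) <= 2 for run in matrix)
```

The final assertion makes the key property of the paragraph above concrete: no run ever lands on a corner of the design space.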

Central Composite Design Technique […]

Watch and learn as i GET IT subject matter expert Eric Bansen walks through the Context Toolbar feature in CATIA V6 2016X 3D Experience. The Context Toolbar allows you to create and manipulate geometry within your part model. This session is great for those who are currently learning or want to know more about CATIA V6 2016X 3D Experience. For further interest in CATIA V6 2016X 3D Experience, there is the i GET IT course “3D Experience CATIA V6 2016X New Essentials”.

Click to Join

If you are in the business of designing and engineering products, then you have PLM. This is a statement of fact. The question then becomes: what is the technology underpinning the PLM process that is used to control your designs?

Because of the way that technology changes and matures, most organizations have a collection of software and processes that support their PLM processes. This can be called the Point Solution approach. Consider a hypothetical setup below:

The advantage of this approach is that point solutions can be individually optimized for a given process – so, in the example above, the change management system can be set up to exactly mirror the internal engineering change process.

However, this landscape also has numerous disadvantages:

  1. Data often has to be transferred between different solutions (e.g. what is the precise CAD model tied to a specific engineering change?). These integrations are difficult to set up and maintain – sometimes to the point of being manual tasks.
  2. The organization has to deal with multiple vendors.
  3. Multiple PLM systems working together require significant internal support resources from the IT department.
  4. Training and onboarding of new staff is complicated.

The alternative to this approach is a PLM Platform. Here, one technology solution includes all necessary PLM functionalities. The scenario is illustrated below:

It is clear that the PLM Platform does away with many of the disadvantages of the Point Solution; there is only one vendor to deal with, integrations are seamless, training is simplified, and support should be easier.

However, the PLM Platform may not provide the best solution for a given function when compared to the corresponding point solution. For example, a dedicated project management application may do a better job at Program Management than the functionality in the PLM Platform; this may require organizational compromise. You are also, to some extent, betting on a single technology vendor and hoping that they remain an industry leader.

Some of the major PLM solution vendors have placed such bets on the platform strategy. For example, Siemens PLM have positioned Teamcenter as a complete platform solution covering all aspects of the PLM process. (refer to my earlier blog post What is Teamcenter? or, Teamcenter Explained). All of the PLM processes that organizations need can be supported by Teamcenter.

Dassault Systèmes have pursued a similar approach with the launch of their 3DEXPERIENCE platform, which also contains all of the functions required for PLM. In addition, both are actively integrating additional functionality with every new release.

So what is your strategy – Point or Platform? This question deserves serious consideration when considering PLM processes in your organization.

For many years, finite element modeling has been the job of a specialist; the tools used to perform even simple finite element analysis have been complex enough to require a subject matter expert. This is primarily due to the complex, difficult-to-understand graphical user interfaces of these products. The job is made more difficult still by the advanced engineering subject knowledge required of the analyst.

Can a mechanical designer who uses CAD tools to create engineering drawings be trained to perform engineering simulations?

In today’s product availability scenario, the answer is “yes.”

A CAD designer using CATIA can create and execute simple finite element models within the CATIA environment by using CATIA workbenches that have been developed for simulations. This makes it intuitive and easier for designers to ensure that their parts meet their design requirements.


How designer-level tools simplify the simulation methodology

  • No need for an expert-level analyst tool to perform simple finite element simulations.
  • No need for manual data transfer between the design and analysis departments.
  • No need for geometry clean-up tools to fix data translation errors.

There are obvious benefits to adopting this simplified approach that integrates the design and analysis environments. The designer can predict design problems early in the design process and can subsequently check various design alternatives in less time. This is primarily due to the tight integration of designer-level tools with knowledge-based engineering, which allows the designer to deliver a better product in less time.

Part Level Simulation

From a geometrical perspective, the simulation model can be generated at the part level to begin with. The native integration within CATIA allows users to perform stress, displacement, and vibration analysis at any time in the design process, allowing more accurate sizing of parts and fewer design iterations. Individual parts consisting of solid, surface, and wireframe geometries can be analyzed under a variety of loading conditions. The analysis specifications, such as loads and restraints, are associative with the design, allowing users to perform analyses quickly and easily. These specifications are then automatically incorporated into the underlying finite element model, meaning that users do not have to work directly with the finite element model. “Virtual parts” allow items like forces, moments, and restraints to be modeled easily without requiring a detailed geometric representation.
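A designer can also sanity-check such a part-level result against a closed-form hand calculation. The sketch below is not a CATIA API – it simply evaluates the textbook tip deflection of a cantilever under an end load, δ = FL³/(3EI), with illustrative numbers, the kind of figure one might compare against the solver’s displacement result.

```python
# Hand calculation a designer might compare against a part-level
# FEA result: tip deflection of a cantilever beam under an end
# load, delta = F * L^3 / (3 * E * I). All numbers are illustrative.

F = 500.0           # end load, N
L = 0.30            # beam length, m
E = 210e9           # Young's modulus of steel, Pa
b, h = 0.02, 0.01   # rectangular cross-section width and height, m
I = b * h**3 / 12   # second moment of area, m^4

delta = F * L**3 / (3 * E * I)
print(f"tip deflection = {delta * 1000:.2f} mm")
```

If the FEA displacement is wildly different from such an estimate, it usually points to a mistake in loads, restraints, or units rather than in the solver.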

Standard reports can be automatically generated in HTML format, providing clear and detailed information about the results of the analysis, including images associated with the computations. These reports can be used to document the analyses that have been performed and to communicate the results of the analysis to other stakeholders in the organization. CATIA V5 Analysis users benefit naturally from the overall PLM solution provided by Dassault Systèmes, including ENOVIA V5 for data and product lifecycle management. CATIA V5 Analysis users can store, manage, and version all the data associated with their product’s simulation and share the information within the extended enterprise. This unique capability allows collaboration and provides access to advanced PLM practices such as concurrent engineering and change management.


Assembly level simulation

If the concept of virtual parts no longer holds and the complexity of parts interacting with each other makes assembly-level simulation mandatory, it is possible to create analysis models for assemblies as well. Analyzing assemblies, including an accurate representation of the way the parts interact and are connected, allows for more realistic and accurate simulation. The designer does not have to make simplifying assumptions about the loading and restraints acting on an individual part; instead, the part can be analyzed within the environment in which it operates, with the loading automatically determined from the way the part is connected to and interacts with surrounding parts.

The various types of connections that can be modeled include bolted connections, welded connections, pressure-fitting connections, and many more. To make the job even easier for the designer, these connections can be defined using assembly-level constraints that already exist in the CATProduct model. When the design changes, the associated assembly constraints and the corresponding FEA connections are updated, producing an updated FEA model that is ready for analysis.

Concurrent engineering made easier

The “assembly of analysis” capability enables concurrent engineering. For example, the various parts in an assembly can be modeled and meshed separately by different users. They can either use the CATIA V5 meshing tools or import orphan meshes (meshes that don’t have any geometry associated with them) developed outside of CATIA Analysis using a variety of different modeling tools. The user responsible for analyzing the assembly can consolidate the different meshes, connect the parts, apply the loading specifications, and run the simulation. This can significantly reduce the turnaround time when analyzing large assemblies, particularly since some of the parts may already have been analyzed, in which case their analysis models are already available.


Extended solver capabilities

The basic FEA solver present in the CATIA designer workbench is called the “Elfini” solver; it can model only simpler physical problems such as linear materials, small deformations, small rotations, and bonded contacts. Real-life problems can be much more complex and may necessitate an advanced solver. To address such scenarios, it is possible to include the well-known non-linear solver Abaqus in the CATIA designer environment; it can model the effects of geometric nonlinearity, such as large displacements, and allows nonlinear materials to be included, such as yielding metals and nonlinear elastic materials like rubber. It also offers more advanced contact capabilities, including the ability to model large relative sliding of surfaces in contact.

The Abaqus capability enables the effect of multiple steps to be analyzed, where the loading, restraints, contact conditions, etc., vary from one step to the next. This powerful technique allows complex loading sequences to be modeled. For example, a pressure vessel might be subjected to an initial bolt tightening step, followed by internal pressurization, and conclude with thermal loading.


 

My last post outlined how to derive more value from PLM data through reports. The complexity of data in the engineering environment is skyrocketing, and Teamcenter as a PLM system provides advanced reporting capabilities for enterprise data, including data managed in external systems like MRP and ERP.

The Teamcenter Report Builder application provides basic reporting capabilities for data managed inside Teamcenter. It supports two kinds of reports:

  1. Summary Reports
  • Reports that summarize similar information – for example, reports that show all the employees, all the items belonging to a user, or the release status of items
  • Context-independent reports – no object selection required
  • Generated from Teamcenter saved queries
  2. Item Reports
  • Reports that can be run on a particular object – for example, BOM or workflow information for a given object
  • Executed in the context of one or more objects

Behind the scenes, Report Builder produces a data dump driven by Teamcenter queries and supports output to common formats like Excel, XML, text, and HTML. It’s easy to build these simple reports from Teamcenter saved queries, and they can be run from both the rich client and the Active Workspace client. Excel can be further leveraged for complex processing, charting, and aggregation of the output.
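The same kind of aggregation can also be scripted directly against a tab-delimited text export. The column names below (“Item ID”, “Status”, “Owner”) are hypothetical – a real export’s fields depend on the saved query behind the report.

```python
# Sketch: aggregating a Report Builder text export outside Excel.
# The fields ("Item ID", "Status", "Owner") are hypothetical; a
# real export's columns depend on the saved query behind the report.
import csv
from collections import Counter

export = """Item ID\tStatus\tOwner
000101\tReleased\tamy
000102\tIn Work\tbob
000103\tReleased\tamy
"""

rows = list(csv.DictReader(export.splitlines(), delimiter="\t"))
by_status = Counter(row["Status"] for row in rows)
print(by_status["Released"])  # 2 items at Released status
```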

The Teamcenter Reporting & Analytics module provides additional, more advanced reporting capabilities. It can summarize information and present data from many sources in a single report using an easy-to-build, configurable, drag-and-drop layout.

It can leverage standard formatting tools like headers/footers, dates, page numbers, report names, filters, tables, charts, and elements. Reports can be run from either the Active Workspace or the Teamcenter Reporting & Analytics client. Its business intelligence is designed for Teamcenter and understands the relationships and associations of PLM information. It comes with over 100 out-of-the-box reports in areas like Change Management, BOM Reports, Substance Compliance, Workflow, Administrator Reports, Verification Management, PMM, Schedule Manager, and Requirements Manager. It supports powerful, fast BOM reporting; project planning, status reporting, and dashboards; and process and change reporting.

It has direct access to Teamcenter data through APIs and has connectors to standard enterprise applications. It can also enforce data security based on the Teamcenter access model.  Additional capabilities include:

  • Customized Analytics
    • Organization-specific process status metrics and KPIs
    • Multi-level root-cause analysis
    • Mean time between failure / to failure (MTBF, MTTF) analysis
    • Historical Performance Analysis
  • Reporting Control
    • Save Snapshots of pre-defined reports
    • Group/Role based Access to report data
    • User Controlled Conditional Formatting
  • Resource Management
    • Automated Report Scheduler and Delivery
    • Submit Analysis to queue for load management
    • Caching Techniques to reuse data cubes
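As a concrete example of one of the analytics listed above, the MTBF figure such a report aggregates is just total operating time divided by the number of failures observed. The numbers here are made up for illustration.

```python
# Toy version of the MTBF figure such a report aggregates:
# mean time between failures = total operating time / failure count.
operating_hours = 10_000.0
failures = [1_200, 4_700, 9_300]  # hours at which failures were logged

mtbf = operating_hours / len(failures)
print(f"MTBF = {mtbf:.0f} hours")  # MTBF = 3333 hours
```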

Teamcenter Reporting & Analytics benefits include:

  • Analytics, Dashboards and Traditional Reporting – understand your data to improve your products and processes
  • Time to Value – start with pre-configured reports and enable custom reports for your business in a couple of weeks, not months or years
  • Designed for Teamcenter – enable your entire enterprise to easily understand the information they require to make better decisions
  • Self Service Analytics – enable data discovery through self service analytics designed for Teamcenter and optimized to your needs

Let’s review the role of the Item Master in managing components and all of the relevant documentation in Autodesk Vault. There are three main uses for the Item Master in Vault:

  1. Container for all relevant documentation – An Item in Vault is really nothing more than a container for all the relevant documentation related to a component. This could be a PDF file, an AutoCAD drawing, or an Inventor part and drawing. Items are most commonly created by promoting a document, at which point it is assigned an item number. If an Inventor part or assembly is promoted, the associated drawing is also captured, and this begins the process of capturing all the relevant documentation.
  2. Mechanism for release management – Like individual files, Items also have their own workflows and release process. So rather than trying to manage the release of each individual file, the entire package of relevant documentation can be released from the item level instead.
  3. Common BOM format for communication to other business systems – Items also allow the management of a Bill of Materials (BOM). A BOM can be built from scratch from multiple items; however, it is more commonly automated from Autodesk Inventor file relationships. An Inventor top-level assembly will automatically generate the beginning of a BOM in the Item Master. This BOM can then be edited to add extra items or change quantities if desired. It can also be exported to a neutral format for communication to other business systems such as ERP or MRP.
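In practice, a “neutral format” is often just a flat delimited file. The sketch below shows the idea with a hypothetical three-line BOM; the item numbers and columns are invented, and Vault’s own export dialog produces a similar flat file with whatever item properties are mapped.

```python
# Sketch of a "neutral format" BOM export for an ERP/MRP system --
# here a flat CSV. The item numbers and columns are invented; a
# real Vault export carries whatever item properties are mapped.
import csv
import io

bom = [
    {"item": "100045", "description": "Frame Assembly",   "qty": 1},
    {"item": "100102", "description": "Rect. Tube 50x25", "qty": 4},
    {"item": "100215", "description": "End Cap",          "qty": 8},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["item", "description", "qty"])
writer.writeheader()
writer.writerows(bom)
print(buf.getvalue())
```

Because the file is plain text with a fixed header, the receiving ERP or MRP system can import it without knowing anything about Vault or Inventor.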

So you’re a manager at a manufacturing company. You make things that are useful to your customers and you answer to the executives regarding matters such as budgets, efficiencies, timelines and deliverables. You will have at least heard of PLM; perhaps you have attended a conference or two. But how badly do you need to implement it or retool an existing setup?

Here are 10 indicators:

  1. Your staff is always late meeting deadlines. This results from poorly executed projects, inefficient processes, and lack of clear deliverables. All of these problems can be addressed by a PLM system, starting with the enforcement of common processes and followed up by proper project planning.
  2. Department costs are creeping up. You are held to a tight budget by the organization. You are always close to or exceed your budget and it is difficult to get a handle on why. A PLM system can help this in two ways: more efficient processes leading to greater productivity and by providing better information to managers.
  3. Rework is rampant. A lot of work needs to be repeated or reworked because it was not correct the first time. A PLM system can certainly help with this problem by supporting common working practices and introducing checks at crucial points.
  4. Your department is constantly battling other departments. There is a lot of finger pointing and blame that goes around the organization. No one can pin down who is responsible or when information was provided. PLM can provide automatic notifications, timestamped deliverables, and clear and unequivocal instructions.
  5. There is no accountability in your department. It is difficult to diagnose where mistakes were made and who is responsible, and people are always blaming other people. A PLM system can provide the objective data needed to identify root causes and establish accountability.
  6. Overtime is out of control. Excessive overtime worked in your department is always a concern. A PLM system can help improve productivity and give managers better information regarding where inefficiencies exist.
  7. Your competitors always seem better. Your bosses are always holding you up against your competition and showing how they are better. A PLM system can put you ahead because there’s a good chance the competition do not have a PLM system, or have not made good use of it if they do.
  8. External customers complain that they do not get the information they need. You owe your customers information at various stages during the design cycle and they often don’t receive it in a timely manner. A suitably configured PLM system can improve this dramatically.
  9. Your suppliers provide the wrong information. You are constantly going around in circles with your suppliers regarding information. But do your suppliers have the right capabilities to begin with, and do you have the capability to meet them on the same terms? PLM technology can bridge this gap.
  10. Process adherence is poor. Although you have some level of documented processes, adherence is poor. A correctly configured PLM system can fix this quickly.

Do you have three or more of these issues keeping you up at night? Time to take a serious look at a PLM system.

© Tata Technologies 2009-2015. All rights reserved.