This is an exciting post for me! Dassault has just come out with a couple of new bundles that blow the doors off anything I have seen previously.

CATMEE – Mechanical Engineering Excellence

The first package is named CATMEE; this would be the “Mechanical” version of the package. In Classic terms, I would previously have recommended an MD2 configuration for this purpose; in PLM Express terms, I would have recommended a CAC+MCE bundle to these types of users. They are typically heavy on the mechanical solid modeling portion of CATIA and do not do very much surfacing. CATMEE is a CAC+MCE on steroids! It includes CAT3DX (which I talked about in more detail in my last post) AND also includes FPE (Fabricated Product), JTE (Jig and Tool Creation), PRX (Animated Product Review), FTX (3D Master), and TRE (Technical Specifications Review).

CATMEE Package Bundle

I realize that this sounds like a bunch of trigram soup. What does it really mean in CATIA V5? Well, from a workbench standpoint, the CAC+MCE add-on looks like this:

CAC+MCE Workbenches

From a workbench standpoint, CATMEE looks like this:

CATMEE Bundle Workbenches

Take a closer look: you get Sheet Metal, 3D GD&T functionality (the good one, FTA!), Mold Tooling, Structure Design, and also DMU! In fact, Kinematics, Space Analysis, and Fitting Simulation alone can get expensive as add-ons, but here they come with the bundle. Imagine cutting a section that is actually still there when you click OK, is available in the specification tree, and updates when you change your part – plus clearance checks, interference checks, and so on. MD2 and/or CAC+MCE users know exactly what I am talking about!

If you are in the market for a new seat or two this year and you are a mechanical customer, you should talk to your account manager and ask about this package; the new configurations not only help your productivity but also expand the kinds of parts you can design and the markets you can get into.

CATMSE – Mechanical and Shape Engineering Excellence

This package is where you will really get your bang for the buck! CATMSE is a package we would have previously bundled as either an HD2 (Classic) or CAC+MCE+HDX (PLM Express). It is designed more for the mechanical-plus-surfacing (hybrid) designer role. Traditionally, CAC+MCE+HDX gave you the GSD version of the Generative Shape Design workbench (better sweep functions, laws, etc.), as well as DL1 (the Developed Shapes toolbar in GSD) and a light version of the FreeStyle workbench (FS1). […]

My last post outlined how an integrated product lifecycle management (PLM) and service lifecycle management (SLM) tool framework can benefit both product development organizations (brand owners) and customers (asset owners) by enabling higher-quality service at lower cost, resulting in increased product/asset utilization and productivity. Teamcenter, as a leading PLM platform, supports this vision. With Teamcenter SLM solutions, the service and support phase of the product lifecycle is included in your overall PLM vision. Teamcenter bridges the gaps between the engineering, logistics, manufacturing, and service communities. OEMs and service providers can drive more efficient service operations with a single source of knowledge for both products and assets.

For OEMs, Teamcenter enables the reuse of design and manufacturing data to enhance service content, and the incorporation of service feedback to support Design for Serviceability and other product improvement initiatives. This holistic approach to the full product lifecycle helps the OEM compete successfully in the service market. Teamcenter unifies SLM with PLM to support bi-directional collaboration between product engineering and service operations. Service teams can capitalize on the re-use of product knowledge from engineering and manufacturing to improve service planning and execution. In return, service teams can provide feedback to engineering to improve product designs for serviceability and reliability.

For the third-party service provider, the service data management and applications allow them to efficiently execute service activities in a global marketplace through a single service platform. Using configuration-driven BOM management, Teamcenter delivers a fully linked, full-lifecycle BOM environment that includes the EBOM, SBOM (Service BOM), and Asset BOM to configure accurate information in support of service. Different service disciplines can share a common understanding of support requirements, and service teams can coordinate operational activities for greater compliance, faster service, and lower costs.

The highlights of the solution include:

Maximize Service Knowledge Management and Value

With Teamcenter as the core of your SLM strategy, you have one source of service knowledge management. You can perform service activities with a full understanding of physical product/asset configurations, status, and service history. You can order the correct parts, ensure that the proper training is done, and access all the appropriate information necessary to manage service operations.

Create Effective Service Plans

Service plans are the key to profitable service operations. Teamcenter provides you with the fundamentals to author and publish service documentation as the source of work scope definition. You can drive service operations by providing all the detailed information that teams need to track and understand asset health, such as service requirements, task-by-task procedures, necessary resources, and utilization characteristics. Your technicians have a complete understanding of service needs from Teamcenter, so they are prepared to perform reactive, proactive, and upgrade service activities.

Optimize Service Work with Schedule Visibility

With the detailed service plans in Teamcenter, you can schedule service activities with a complete understanding of the work scope, in order to meet customer expectations for product availability and reliability. Work orders generated from service plans are used to create service schedules. It is the visibility into the schedule and resources provided by Teamcenter that allows you to optimize service events and ensure that the right resources (parts, qualified people, and tools) are reserved for the work.

Empower Service Technicians with Work Instructions

Service technicians are a limited resource. When you provide them with complete, intelligent work packages, technicians can execute service work efficiently, accurately and compliantly. With Teamcenter, you can deliver service work instructions, safety/hazard notes, and service procedures (text, 2D/3D and animations). You can also include asset configurations and data collection requirements. Technicians can enter data, observations or discrepancies, and digitally sign off on work, which automatically updates the service schedule.

FIRST Robotics Team 1706, the Ratchet Rockers, will compete this weekend (April 26-29)
in the FIRST Championship in St. Louis, MO.

The Ratchet Rockers competed in two events this year. In Huntsville, AL, they qualified #4 out of 50 teams but lost in the quarterfinals. At their other event, in St. Louis, they qualified #2 out of 52 teams and ended up winning the regional, which qualified them for the Championship in St. Louis.

Team 1706, the Ratchet Rockers

The robotics team is based in Wentzville, MO (just northwest of St. Louis), serving kids in the Wentzville School District and pulling students from three different high schools. They have approximately 50 students on the team and started the school year with six full-time CAD students.

One of our accomplishments this year is a great shooter

FRC mentor Mark Roberts was looking to expose the entire team to Autodesk Inventor and to address the learning needs of the team’s existing experienced CAD users. Mark planned on running a classroom-led “Inventor 101” class, but was searching for a solution the students could use to keep learning independently after that class.

i GET IT and Tata Technologies are proud to sponsor the Ratchet Rockers and provide the much-needed training material. By supplying the team with 50 i GET IT Autodesk Annual Subscriptions, we gave the entire team the opportunity to learn Autodesk Inventor and any of our other Autodesk offerings independently and at their own pace.

“One of our accomplishments this year is a great shooter. I believe we are somewhere around the 20th best robot in the world (approx. 4500 teams) when it comes to shooting fuel (balls),” said Mark Roberts of his team.

Following their regional win, the team prototyped new and better functions for the robot to improve their chances against the 400-500 teams that will compete this weekend.

Here is a video of this year’s game. Look for robot 1706, the one shooting all the balls.

 

Good Luck this weekend Ratchet Rockers!

One of the first things I typically discuss with customers concerning file management is the relationship between files in their engineering data. This is especially the case when working with data from 3D CAD systems like Autodesk Inventor. When you have assemblies, parts, drawings, and presentations all with linked file relationships, it can be extremely challenging to manage this data without a tool that understands and maintains all the file links. Simply renaming a single file can cause all sorts of problems if done in Windows Explorer. Here are some of the areas where file relationships matter.

  1. Part, Assy, Drawing – As previously mentioned, 3D CAD data can be a challenge to manage.  Simply understanding where a file is used (or linked) can be tremendously helpful.

    “Where Used” within Autodesk Vault

  2. Copy Design – There is a “copy design” tool in Autodesk Vault that can make it much easier to reuse existing designs in the creation of variants based on the original.  This also reduces the amount of duplicate data in Vault because so much more is reused rather than recreated.
  3. Renaming – In many workflows, files are initially created using descriptive filenames.  These files then need to be renamed once a design is approved and will go into production.  With Inventor data, renaming files in Windows Explorer will break the links between parts, assemblies, and drawings. The files then have to be manually relinked, which can become extremely troublesome if a file was used by more than one assembly without anyone knowing it: when someone opens the other assembly, the file is missing and very difficult to locate.  Vault simply fixes all the file references whenever a file is renamed, so this isn’t a problem (see the sketch after this list for the general idea).
  4. Moving – Files that are moved in Windows Explorer can cause the same problems as renaming, but usually because of the way Inventor uses project files. Using Autodesk Vault with a single Vault-type project file eliminates many of the challenges in moving files to more relevant or common locations.
  5. Attachments – Attachments in Vault can also be tracked.  One example might be a design specification document that applies to a whole class of components.  The design spec can be attached to the relevant designs.  If the design spec document changes, you can simply do a “where used” from it to see which files will be impacted by the specification change.
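To make the renaming point above concrete, here is a minimal sketch of what “reference-aware” renaming means. It is not Vault’s internal implementation – the file names, the reference index, and the helper functions are all invented for illustration – but it shows the mechanism a PDM tool has to provide:

```python
# Illustrative sketch only -- NOT how Autodesk Vault is implemented internally.
# It shows why renaming a file by hand breaks links: every assembly that
# references the old name must be updated in the same operation.

from typing import Dict, List

# Hypothetical reference index: assembly file -> files it references.
reference_index: Dict[str, List[str]] = {
    "bracket_assy.iam": ["plate_001.ipt", "bolt_m8.ipt"],
    "frame_assy.iam": ["plate_001.ipt", "rail_left.ipt"],
}

def where_used(filename: str) -> List[str]:
    """Return every assembly that references the given file (the 'Where Used' idea)."""
    return [asm for asm, refs in reference_index.items() if filename in refs]

def rename_with_references(old: str, new: str) -> None:
    """Rename a file AND patch every reference to it, so no link is left dangling."""
    for asm in where_used(old):
        reference_index[asm] = [new if ref == old else ref for ref in reference_index[asm]]
    # ...the physical file rename would happen here, as part of the same operation.

# Renaming plate_001.ipt in Windows Explorer updates none of the references above;
# a reference-aware rename keeps both assemblies pointing at the new name.
rename_with_references("plate_001.ipt", "plate_001_prod.ipt")
print(where_used("plate_001_prod.ipt"))  # ['bracket_assy.iam', 'frame_assy.iam']
```

The point of the sketch is simply that the rename and the reference updates must travel together – which is what Vault does for you and Windows Explorer does not.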

“To specialize or not to specialize, that is the question.”

The question of specializing vs. generalizing arises in so many areas: biology, health, higher education and, of course, software.  When one has to decide between the two ends of the spectrum, the benefits and risks must be weighed.

As environments have changed over time, animals have had to make a choice: change or perish. Certain species adapted their biology to survive on plants – herbivores; others, on meat – carnivores.  When in their preferred environments with ample resources, each can thrive.  However, if conditions in those environments change so that those resources are not as bountiful, they may die out. Then comes the omnivore, whose adaptations have enabled it to survive on either type of resource. With this wider capability for survival comes a cost in efficiency. The further you move up the food chain, the less efficient the transfer of energy becomes.  Plants produce energy, only 10% of which an herbivore derives, and the carnivore that feeds on the herbivore only gets 10% of that 10% – i.e., 1% of the original energy.

Three hundred trout are needed to support one man for a year.
The trout, in turn, must consume 90,000 frogs, that must consume 27 million grasshoppers that live off of 1,000 tons of grass.
— G. Tyler Miller, Jr., American Chemist (1971)

When it comes to deciding on a course of action for a given health problem, people have the option of going to their family doctor, a.k.a. general practitioner, or a specialist. There are “…reams of papers reporting that specialists have the edge when it comes to current knowledge in their area of expertise” (Turner and Laine, “Differences Between Generalists and Specialists”), whereas the generalist, even if knowledgeable in the field, may lag behind the specialist and prescribe out-of-date – but still generally beneficial – treatments.  This begs the question: what value do we place on the level of expertise?  If you have a life-threatening condition, then a specialist makes sense; however, you wouldn’t see a cardiologist if your heart races after a walk up a flight of stairs – your family doctor could diagnose that you need some more exercise.

When it comes to higher education, this choice of specializing or not also exists: to have deep knowledge and experience in a few areas, or a shallower understanding across a broad range of applications. Does the computer science major choose to specialize in artificial intelligence or networking? Or neither?  How about the music major?  Specialize in classical or German Polka? When making these decisions, goals should be decided upon first. What is it that drives the person? A high salary in a booming market (hint: chances are that’s not German Polka)? Or is the goal pursuing a passion, perhaps at the cost of potential income? Or is it the ability to be valuable to many different types of employers in order to change as the markets do? It has been shown that specialists may not always command a higher price tag; some employers value candidates who demonstrate they can thrive in a variety of pursuits.

Whether you’re looking to take advantage of specialized design products (for instance, for sheet metal or wire harnesses) or to gain the value inherent in the general suite of tools present in a connected PLM platform that can handle project management, CAPA, and Bill of Materials management, we have the means. A “Digital Engineering” benchmark can help you decide if specialized tools are right for your company. Likewise, our PLM Analytics benchmark can help you choose the right PLM system or sub-system to implement.

Specialize, or generalize? Which way are you headed and why?

In this era of new levels of globalization, product companies are faced with market pressures from global competition and price deflation. Today they seek alternate sources of profitable revenue growth enabled by value-add service products. Developing a service-based revenue stream and then delivering product service that is both effective and profitable has its own challenges, however. Even mature service organizations are seeking new approaches to reach a significantly higher quality of service delivery.

Today, in a typical product company, there is no single application to manage the data and the decision points required to deliver effective service. Multiple enterprise applications – including PLM, ERP, and often a combination of local databases, spreadsheets, and stand-alone IT systems – are involved in service management. This results in fragmented information and knowledge processes around service delivery.

A new approach, centered on incorporating service lifecycle management (SLM) as an integral part of product lifecycle management (PLM), is required to achieve significant improvement in service readiness and delivery. First, this approach focuses on making complex products easier and less costly to maintain, allowing for more effective allocation of service resources. The second key component is managing the complexity of service information, which reduces the cost and time to create and deliver critical service documentation and service records while improving the quality and efficacy of this information.

With SLM approached as an extended PLM process, design information can be used to bootstrap and enhance service planning, product changes and updates flow directly into revised service work instructions, and field experience provides up-to-date insight into product quality. The bulk of the information required for service – such as illustrations, schematics, and work instructions – already exists within the engineering organization and can be repurposed with relatively little effort. 3D CAD models and Bills of Materials can be used to create everything from exploded wireframe views to photorealistic renderings and remove-and-replace animations that help in service execution. Manufacturability and ergonomic simulations can be used to improve the safety and efficiency of repair procedures.

The expanded PLM system needs to act as a centralized repository for the service bill of materials (sBOM) alongside the engineering and manufacturing BOMs, so that service items – which are mostly design and manufacturing items repurposed for service – can be synchronized to reflect the most up-to-date state of information. This synchronization is possible when SLM is part of PLM and shares the same configuration and change management processes.
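As a purely conceptual sketch of that shared-item idea – the class names and item numbers below are assumptions, not Teamcenter’s actual data model – separate BOM views can reference one common item master, so a revision made once under the shared change process is immediately visible in every view:

```python
# Conceptual sketch only -- the class names and item numbers are assumptions,
# not a real PLM schema. One shared item master is referenced by both the
# engineering and service BOM views, so a revision bump made under the common
# change process shows up in every view at once.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ItemMaster:
    item_id: str
    revision: str

@dataclass
class BomView:
    name: str                          # e.g. "EBOM" or "sBOM"
    lines: List[ItemMaster] = field(default_factory=list)

# Single source of item definitions, shared by every BOM view.
items: Dict[str, ItemMaster] = {
    "PUMP-100": ItemMaster("PUMP-100", "A"),
    "SEAL-210": ItemMaster("SEAL-210", "A"),
}

ebom = BomView("EBOM", [items["PUMP-100"], items["SEAL-210"]])
sbom = BomView("sBOM", [items["SEAL-210"]])    # the seal is also a serviceable spare

# A change order revises the seal once, in one place...
items["SEAL-210"].revision = "B"

# ...and both views immediately reflect the up-to-date revision.
print([(line.item_id, line.revision) for line in ebom.lines])  # [('PUMP-100', 'A'), ('SEAL-210', 'B')]
print([(line.item_id, line.revision) for line in sbom.lines])  # [('SEAL-210', 'B')]
```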

This way, enterprise PLM systems become the digital backbone of the entire product life cycle – including SLM – and SLM becomes a dynamic process connected with PLM that continues throughout the useful life of the product or asset. This reduces process fragmentation and provides rich end-to-end context for better and more profitable service.

The combined PLM and SLM approach, along with new service models based on the latest technologies (such as the Internet of Things), enables brand owners to deliver higher quality service at lower cost, resulting in higher profit margins, enhanced brand image, and greater customer loyalty. Product or asset owners who are the end customers also benefit from increased utilization and productivity due to faster and more reliable service.

What do you think? Is your organization connected?

Today’s topic focuses a little on the licensing side of CATIA – namely CAT3DX and the thinking behind why it exists.

Several years ago, Dassault changed the way they were packaging CATIA V5 by introducing PLM Express as a way to buy it; my colleague Jason Johnson explained this in a previous post. As he mentioned, this was referred to as CATIA TEAM PLM and was really designed to connect current CATIA V5 users to their new PLM offering, which was ENOVIA SmarTeam.  He also wrote briefly about the configurations and trigrams that make up the software.  The easiest way to think about trigrams is that a group of trigrams makes up a configuration, and trigrams by themselves give you access to particular workbenches – or, in some cases, only add toolbars to existing workbenches.

Why does this matter? Because there is a new sheriff in town called 3DEXPERIENCE. Much more than a PLM system, the 3DEXPERIENCE platform suite of tools assists users in managing their daily work: projects, processes, BOMs, documents, CAD, and so on.  While an old CAC (CAT) license – which was the base configuration for PLM Express – would give you access to SmarTeam by bundling in the TDM and TEM trigrams, the new CAT3DX gives you all of that, plus access to the ENOVIA 3DEXPERIENCE platform, by including the PCS and CNV trigrams as well. These are the minimum trigrams needed to connect to the platform (the price of admission).

The Dassault idea is still the same – help CATIA V5 users move away from file-based, directory-based storage (which has always presented its own challenges) and help companies regain control of their data via the new platform. The only caveat is that you need to install ENOVIA to manage your data, which is not as simple as throwing in a set of discs like SmarTeam was. ENOVIA requires setting up a database on SQL Server or Oracle, and then configuring the various pieces (web server, authentication, Java virtual machine, etc.).  Once this has been configured, the base PCS + CNV combination gives you the ability to vault your data, set up workspaces defining where and how it will be stored, and perform some level of change management on it (setting up Change Requests and Routes that determine how your data is routed through its life cycle to release).

Creation Menu

 

The ENOVIA applications that come with the PCS + CNV combination are Classify & Reuse, Collaboration & Approvals, Collaborative Lifecycle, Design IP Classification, Exchanges Management, My Collections, My Issues, My Route Tasks, and X-CAD Design. These are plenty to help your team begin to get to a single source of truth – meaning you never have to guess what state the latest data is in.

ENOVIA Apps

You also have access to applications for business intelligence information. This includes the latest dashboard technology: dashboards are ways of viewing data configured to your liking, not at all unlike the old iGoogle portal, which allowed you to customize your view of news and other content. (In 2012, Dassault acquired Netvibes, whose technology is behind these dashboards.)


Information Intelligence

[…]

Autodesk Vault uses the concept of a “Local Workspace” whenever files are opened or checked out.  Essentially, whenever a Vault file is accessed, a copy is cached in the workspace on the user’s local workstation.  From a user’s perspective, the workspace can be ignored for most regular work.  There are several benefits to a local workspace.

  1. Performance improvement over a network share – One of the problems of working without a PDM system is that files are opened directly across the network.  Files being accessed and edited live on a network share and stay there while being worked on.  In environments with multiple users working with large datasets, this can become a disaster.  When files are checked out from Vault, they are cached locally, and the workstation’s drives can respond to changes much more quickly than a network server.
  2. Offline workflows – The local workspace also allows users to retrieve data to work on while disconnected from their corporate network.  The local workspace acts much like a briefcase: the user simply checks out files, disconnects from the network, works on them, and checks them back in upon returning to the network and logging back into Vault (see the sketch after this list for the general flow).
  3. Better distributed workforce management – For companies with distributed workforces, the local workspace can also be a big benefit.  Combining the performance gains with offline workflows makes distributed work genuinely practical.  All that is required is a remote VPN connection, and then files can be checked in and out of Vault.  The VPN doesn’t have to be permanently connected; when disconnected, it is just like an offline workflow.  Since checked-out files reside locally, distributed users still have good performance while editing and saving their work.
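As a rough illustration of how such a workspace behaves – this is not Vault’s actual code, and the paths and function names are assumptions – check-out copies a file from the server into a local cache, all edits happen against that cached copy, and check-in pushes it back when a connection is available again:

```python
# Rough illustration of the local-workspace idea -- not Autodesk Vault's actual code,
# and the paths are hypothetical. Check-out copies a file from the server into a
# local cache; edits happen against the cached copy, and check-in pushes it back
# once a connection (LAN or VPN) is available again.

import shutil
from pathlib import Path

SERVER_STORE = Path(r"\\server\vault_store")        # hypothetical network location
LOCAL_WORKSPACE = Path.home() / "vault_workspace"   # cache on the local workstation

def check_out(filename: str) -> Path:
    """Copy the file into the local workspace; all edits happen on this copy."""
    LOCAL_WORKSPACE.mkdir(exist_ok=True)
    local_copy = LOCAL_WORKSPACE / filename
    shutil.copy2(SERVER_STORE / filename, local_copy)
    return local_copy   # fast local access, and still usable while offline

def check_in(filename: str) -> None:
    """Push the edited local copy back to the server once reconnected."""
    shutil.copy2(LOCAL_WORKSPACE / filename, SERVER_STORE / filename)
```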

 

This is a second look at the hidden intelligence of CATIA V5. Our topic today focuses on the creation and use of design tables. As I discussed in my last blog post, parameters and formulas can be used to drive your design from the specification tree based on your design intent. We will continue with the rectangular tubing part and build several variations of that tubing that can be driven from a spreadsheet.

Design Table Icon

Most of the work has already been done; although it is not necessary to have pre-defined parameters and formulas, having them makes the process faster. We will begin by again looking at the Knowledge toolbar, this time focusing on the Design Table icon.

When the command is selected, a dialog appears asking for the name of the design table and giving you a choice of whether to use a pre-existing file or create one from the current parameter values.  The difference is whether you already have a spreadsheet filled out with the tabulated values that change in each iteration of the design.

Design Table Dialog

 

In our case, to show the functionality, we will choose the “create with current parameter values” option. Once that is decided, you choose which parameters you want the spreadsheet to drive.  Since we had already created some parameters, we changed the filter to User parameters, chose the values that were NOT driven by formulas (leaving out the INSIDE and OUTSIDE RADII, which are), and moved them to the inserted side by highlighting them and clicking the arrow.

Parameters to Insert

At this point, we have defined that we want a spreadsheet with columns for Height, Width, and Wall Thickness, based on the current values in the model as it is at this moment. When we click OK on the dialog, it asks where we want to save the spreadsheet. I suggest saving it somewhere that anyone who uses the model has at least read access to (e.g., a network drive).  Note that I can also change the file type to .txt if I do not have access to Excel® or other software that can edit .xls files.

Read Access Directory
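If you do go the .txt route, the design table is just a delimited text file: a header row of parameter names and one row of values per configuration. As a hypothetical sketch – the values are made up, and the exact header syntax should be checked against your own parameter names and units – such a file could be generated like this:

```python
# Hypothetical example of a design-table text file -- the values are made up, and
# the tab-delimited layout (header row of parameter names, one row per configuration)
# is only meant to illustrate the shape of the file. Match the column names to your
# own parameter names and units.

import csv

configurations = [
    {"Height (mm)": 50,  "Width (mm)": 25, "Wall Thickness (mm)": 3},
    {"Height (mm)": 75,  "Width (mm)": 50, "Wall Thickness (mm)": 5},
    {"Height (mm)": 100, "Width (mm)": 50, "Wall Thickness (mm)": 6},
]

with open("tubing_design_table.txt", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=configurations[0].keys(), delimiter="\t")
    writer.writeheader()              # Height (mm)  Width (mm)  Wall Thickness (mm)
    writer.writerows(configurations)  # one row per design variant
```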

 

Once this has been defined, your design table is created, linked to your 3D model, and ready to be edited to include your alternate sizes. This is confirmed by the next dialog. To add the other sizes, simply click the Edit table… button; your editor (Excel or Notepad) will launch, and you can fill in rows with your values.

Linked and ready to edit

Once you have edited and saved the values, you can close that software and CATIA will update based on your values.

Excel Modifications

 

CATIA Updated

Now you would just pick the value set you want and click OK for the change to appear on the screen.

File Updated

At any time, you can make changes by finding the Design Table under the Relations section of the specification tree and double-clicking on it.

Design Table under Relations

As you can see, it’s pretty easy to create a design table and drive your parametric file with multiple value sets. The world of CATIA V5 is all about re-use of data and capturing the business intelligence we already know exists in every company.  How can we help you? Tata Technologies has helped many companies do exactly this, time and again.

Stay tuned for Part 3!

This post was originally created on December 8, 2016.

With all the buzz about Additive Manufacturing, or 3D Printing, in the manufacturing world today, there is a lot of mystery and confusion surrounding common practices and techniques. This week’s blog post will address a common type of 3D printing known as Fused Deposition Modeling (FDM).

But first, What is Additive Manufacturing?

Additive manufacturing is the process of creating a part by laying down a series of successive cross-sections (2D “sliced” sections of a part). It came into the manufacturing world about 35 years ago, in the early 1980s, and was adopted more widely later in the decade. Another common term used to describe additive manufacturing is 3D printing – a term which originally referred to a specific process, but is now used to describe all similar technologies.

Now that we’ve covered the basics of 3D Printing, What is Fused Deposition Modeling?

FDM is actually part of a broader category commonly referred to as filament extrusion techniques, all of which utilize a thin filament or wire of material. The material, typically a thermoplastic polymer, is forced through a heating element and extruded as a 2D cross-section onto a platform. The platform is then lowered and the process is repeated until the part is complete. In most commercial machines, and in higher-end consumer-grade machines, the build area is typically kept at an elevated temperature to prevent part defects (more on this later). The most common form, and the first technology of this type to be developed, is FDM.
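To make the layer-by-layer idea concrete, here is a toy sketch of the basic build loop; the numbers are made up and the slicing and extrusion steps appear only as comments, not as calls to any real slicer or printer API:

```python
# Toy sketch of the FDM build loop -- made-up numbers, and the slicing/extrusion
# steps are only comments, not a real slicer or printer API.

import math

part_height_mm = 30.0
layer_thickness_mm = 0.2
num_layers = math.ceil(part_height_mm / layer_thickness_mm)   # 150 successive cross-sections

for layer in range(num_layers):
    z = layer * layer_thickness_mm   # height of this cross-section above the platform
    # 1. slice the 3D model at height z to get a 2D cross-section
    # 2. trace that outline (and its infill) with molten filament from the heated nozzle
    # 3. lower the platform by one layer thickness, then repeat with the next layer

print(f"Part built as {num_layers} layers of {layer_thickness_mm} mm each")
```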

The Fused Deposition Modeling technique was developed by S. Scott Crump, co-founder of Stratasys, Ltd., in the late 1980s, and the technology was patented in 1989. The FDM patent expired in 2009, which helped give rise to the Maker movement by allowing other companies to commercialize the technology.

It should also be noted that Fused Deposition Modeling is also known as Fused Filament Fabrication, or FFF. This term was coined by the RepRap community because Stratasys holds a trademark on “Fused Deposition Modeling.”

What Are the Advantages of this Process? […]
