Author Archives: Anna Brindley
E-Spec’s Adobe Extensions are now available for CC2015
E-Spec has released updated versions of all of our Adobe extensions – now compatible with CC2015.
Convert-It! for Adobe Illustrator
- Provides alternate file formats: JPG, PNG, PSD and PDF
- Separates assets contained on artboards or layer/sub-layers into individual files
- Similar to Photoshop “export asset” feature
Tag-It! for Adobe Illustrator
- Collects and embeds XMP metadata; standard or custom fields
- Validates metadata with “pick lists”
- Some metadata can be required
- Metadata is mapped to your business systems
Route-It! for Adobe Illustrator
- Sends images and metadata to business system APIs
- Configured with authorization
- No user interaction required
In-Cat! for Adobe InDesign
- Automate creation of InDesign publications
- Templates are mapped to your database
- Search for items, select template and create a publication with a single click
- Later update all images and data with another single click
For more information visit our website: www.e-spec.net or contact us at sales@e-spec.net
Images Drive Your Business
Images drive your business. A design sketch initiates a new product. Technical details are captured in images that tell a vendor how to build the product. Photos market the new product to consumers. Logos identify the product as yours. When problems arise in manufacturing, pictures document the defects. Sales and inventory reports include images as a reference. Customer service agents use pictures to ensure they are talking about the correct item. Websites require multiple images of the same product. Store signage must have large versions of the pictures. Marketing needs pictures of the product being used. If your company creates products, images are involved in every aspect of your business.
Most companies treat images as afterthoughts – something that must accompany their data. Images are lumped together on departmental systems, each department maintaining its own copies and trying to ensure it has the latest and correct version of each image. When enterprise software systems are implemented, they may capture images and include them as part of "one version of the truth", but the truth is: not everyone uses the enterprise software – but everyone uses images.
Linear Workflow Ends in DAM
Previously, the product development department would hand products over to marketing, sales and production in a linear fashion. In today's world, speed to market requires all these tasks to be performed in parallel; the hand-offs are not done all at once – they are iterative. There are plenty of solutions being offered to help automate this complicated workflow and its communications, but they still treat images as overhead and attachments.
In the linear workflow, the Digital Asset Management (DAM) system is at the end of the process. Assets are stored when they are complete; the DAM is more of a repository or archive.
Make DAM the Hub
Today, the DAM system needs to be involved up front. DAM is the hub of your business, enabling images to drive the process.
How?
- Define common attributes: enterprise taxonomy and vocabulary. The first step toward implementing an enterprise approach to integration is to define the terminology used by each department and its business systems. This exercise includes mapping the attributes between systems and documenting the allowed values. Relationships between the attributes should also be captured, such as "one-to-many" and "many-to-one".
- Use embedded metadata: standard and custom. Adobe's XMP is the industry standard for formatting metadata. It is used in JPGs created by most cameras, and earlier metadata standards have adopted XMP versions. The current standards contain fields for common elements (especially for photography and publishing), but the power of XMP is the ability to define your own custom schema and tags. Custom fields are used to store your taxonomy and vocabulary.
- Use each system's APIs (RESTful, or SOAP if you haven't upgraded in a while). It used to be common for business systems to have an "open" database, which allowed integration to be accomplished at the database level. Today it is more common for the vendor to supply Application Programming Interfaces (APIs). These APIs are now adopting industry standards, so integration can be performed more easily and in a more generic fashion. The first standard to become popular in recent years was SOAP, but it has rapidly been replaced by RESTful APIs. In both cases the payloads are structured text – XML for SOAP, XML or JSON for REST – allowing straightforward implementations.
- Collect metadata early, at its origin. To make this integration support your workflow, metadata needs to be collected as early as possible: collect all the attributes that are known at creation and add new ones as soon as they can be identified. Make the data collection as easy and unobtrusive as possible – the fewer keystrokes, the better.
- Define common metadata with subsets for user groups/departments. As you collect metadata, you need to keep the dialogs as uncluttered as possible: provide the user with only the attribute fields they are concerned with. The file may have many more fields embedded, but the user doesn't need to see or deal with all of them. Typically you can divide the attributes up by user group or workflow task, which makes user adoption much easier.
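The custom-schema idea above can be sketched in a few lines. The namespace URI and field names below are invented for illustration, not an actual E-Spec or Adobe schema; real XMP embedding would use a dedicated toolkit, while this sketch only builds the RDF-style packet that would be embedded:

```python
import xml.etree.ElementTree as ET

RDF_NS = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
# Hypothetical custom namespace for enterprise attributes (illustrative only)
ENT_NS = "http://example.com/ns/enterprise/1.0/"

def build_xmp_packet(attrs):
    """Serialize enterprise attributes as a minimal XMP-style RDF packet."""
    ET.register_namespace("rdf", RDF_NS)
    ET.register_namespace("ent", ENT_NS)
    rdf = ET.Element("{%s}RDF" % RDF_NS)
    desc = ET.SubElement(rdf, "{%s}Description" % RDF_NS)
    for field, value in attrs.items():
        ET.SubElement(desc, "{%s}%s" % (ENT_NS, field)).text = str(value)
    return ET.tostring(rdf, encoding="unicode")

packet = build_xmp_packet(
    {"StyleNumber": "A1234", "Season": "FW25", "Collection": "Core"}
)
```

The same packet serves every downstream system because the custom fields carry the enterprise taxonomy rather than any one system's internal names.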
An Ideal Scenario
A designer sketches a product design in Adobe Illustrator. Metadata is collected: season, collection, gender, design number, designer’s name and other attributes. The metadata is embedded in the Illustrator file. A JPG version of the file is created and sent to the PLM system along with the metadata. If the metadata points to an existing record, the image is added to that record; if the record does not exist, it is created and the image is added. At the same time a PNG (with transparent background) is sent to an internal website used for cross-departmental communications. The original AI file is cataloged in the DAM system.
The product development advances to the point where a sample is requested from a manufacturing vendor. When the sample arrives, it is sent to the photo studio for photos. As the photos are being processed, metadata is added to the files (sample number, season, collection, gender, design number, designer’s name and other attributes). The photos are cataloged by the DAM system. It is now possible for systems to use the metadata from their sketch files (JPGs or PNGs) to query the DAM system and retrieve the photos.
As the PLM and ERP systems exchange data, records in the ERP now also match the metadata. The internal website can provide data from a group of styles from multiple systems. By selecting the Season/Collection images, queries can return pricing data from one system, color and size data from another, and sales statistics from yet another system.
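The cross-system queries described above boil down to a join on the shared metadata key. A minimal sketch, with plain dictionaries standing in for the PLM, ERP and sales systems' APIs (all names and values here are hypothetical):

```python
def aggregate_product_view(key, systems):
    """Combine the records from multiple systems that share one metadata key."""
    view = {}
    for name, records in systems.items():
        record = records.get(key)
        if record:
            view[name] = record
    return view

# Each "system" is keyed by the same (style, season) compound identifier.
systems = {
    "plm": {("A1234", "FW25"): {"color": "navy", "sizes": "S-XL"}},
    "erp": {("A1234", "FW25"): {"price": 49.99}},
    "sales": {("A1234", "FW25"): {"units_sold": 1200}},
}
view = aggregate_product_view(("A1234", "FW25"), systems)
```

In practice each dictionary lookup would be a REST call, but the principle is the same: one embedded key unlocks data everywhere.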
A customer service representative can use the image to verify that they are referencing the correct product and then use the metadata to access data from the other systems to respond to the customer's inquiry. This same functionality can be built into a "self-service" customer service website.
The Key
The key is to create embedded metadata that combines to build indexes of unique records in every business system. Self-aware images and metadata drive your business workflow and processes.
This article originally appeared in WhichPLM: http://www.whichplm.com/editors-choice/images-drive-your-business.html
Generic Tools–The Key to Simple Integration
After my last article, "PLM Integrations with Adobe Illustrator – Keep It Simple", I have been asked to expound on the correct simple approach. Our experience is that when an integration is too specific or customized to one particular business system, it limits the user's flexibility in providing similar content to other destinations. Creative users do not live in a single-system vacuum; their content can be used and re-purposed by many other people and systems within the enterprise. Our integration approach is based on:
- Industry standard metadata
- Industry standard APIs (usually RESTful, previously SOAP)
- Separating multiple assets from a single file
- Tools to configure/map the metadata and APIs without programming
- Reusable tools – not customized per system, customer or particular installation
First – Identify Attributes
The first step is to identify the attributes that define a unique item (product, asset, object, style, etc.). For most of our customers this is Style # and season; sometimes Style #, color, season, size or size range. The combination of these values will identify a unique record in all systems across the enterprise. Associating an image with these values enables the image to be linked to multiple systems.
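A compound key like Style # plus season can be sketched as a small helper. The field names and separator below are illustrative choices, not a fixed convention:

```python
def unique_key(metadata, fields=("style_number", "season")):
    """Build the compound identifier that links an image to records
    across every business system; fail loudly if a part is missing."""
    missing = [f for f in fields if not metadata.get(f)]
    if missing:
        raise ValueError("required attributes missing: " + ", ".join(missing))
    return "-".join(str(metadata[f]) for f in fields)

key = unique_key({"style_number": "A1234", "season": "FW25"})
```

Rejecting incomplete metadata at key-construction time is what makes "required fields" enforceable before an image ever leaves the designer's workstation.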
Other attributes also need to be identified; the taxonomy and vocabulary of the enterprise. This is an exercise in determining the proper nomenclature, knowing what each system calls every attribute. This will allow information to be exchanged between systems.
Creating Metadata Fields
We provide our customers with a tool to create the metadata fields. We use Adobe’s XMP metadata standard to create custom XMP fields; these can be text, memo (multi-line text), date, combo boxes (pick lists) and check box fields. The field definitions look very similar to XML.
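As a rough illustration of what such a field definition might look like (the element and attribute names below are invented for this sketch and do not match Adobe's actual XMP custom-schema syntax), a pick-list field can be defined in XML and read back in a few lines:

```python
import xml.etree.ElementTree as ET

# Hypothetical field definition for a combo-box (pick-list) field.
FIELD_DEF = """
<field name="Season" type="combo" required="true">
  <choice>FW25</choice>
  <choice>SS26</choice>
</field>
"""

def parse_field(xml_text):
    """Read a field definition into a dict the metadata dialog could use."""
    el = ET.fromstring(xml_text)
    return {
        "name": el.get("name"),
        "type": el.get("type"),
        "required": el.get("required") == "true",
        "choices": [c.text for c in el.findall("choice")],
    }

field = parse_field(FIELD_DEF)
```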
We provide a series of tools that can be configured by the customer to match their requirements. Our metadata collection tools are configured per user/user group. Each configuration is designed to minimize keystrokes for the user. Only the fields the user is concerned with are included. Where possible, default values are assigned to reduce data entry. We collect the metadata as the image is created, embedding the unique identifying data inside the image file. Additional descriptive or informational metadata can also be collected and saved in the file.
Separate Multiple Images
An additional tool is used to separate the multiple images that are contained in a single file. These images are typically separated by layer/sub-layer or by artboards. These derived files will also have the same metadata embedded in them as the original file has.
For systems that "catalog" files (like Digital Asset Management systems), this is enough, as these systems will read the metadata and import it with the image. For other systems we provide tools that will write the metadata and the image (or image path) directly to their database. If the database is proprietary or encrypted, we provide tools that will communicate with the system's APIs to allow the metadata and the image to be imported into the system.
To help automate the integration, "target" locations (folders) are used. Most companies already have something similar in place; users have individual work areas for their working files, and when the files are ready to enter the workflow, they are saved to shared locations so others can access them. These shared locations become our "targets". When a file is saved (or updated) in a target destination, the metadata dialog is displayed and all required fields (typically the unique attributes) must be provided. The file is separated into its individual images and the files are routed as defined in the configuration.
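The target-folder logic above amounts to a lookup from folder name to routing configuration. A minimal sketch, with hypothetical folder names, formats and destinations:

```python
from pathlib import Path

# Illustrative routing table: target folder name -> destination config.
TARGETS = {
    "to_plm": {"formats": ["jpg"], "destination": "plm"},
    "to_web": {"formats": ["png"], "destination": "website"},
}

def route_file(path, targets=TARGETS):
    """Decide where a saved file should go based on its target folder.
    Returns None for files saved outside any target location."""
    folder = Path(path).parent.name
    config = targets.get(folder)
    if config is None:
        return None  # a personal work area, not a target folder
    return {"destination": config["destination"], "formats": config["formats"]}

plan = route_file("/shared/to_plm/A1234_FW25.ai")
```

A real implementation would watch these folders for save events and trigger the metadata dialog before routing; this sketch covers only the routing decision itself.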
Unique Approaches by Companies
Different companies use these tools to solve their unique integration challenges in different ways. One company uses this approach to tie their Illustrator files to the BOM maintained in their ERP system. Another separates the images into both JPG and PNG files; the JPGs are sent to their PLM system while the PNGs are used in their website (transparent PNGs allow the image to maintain the background of the website). The JPGs are later provided to their ERP, sourcing system and data warehouse. With each image, the metadata is used to create and update records in each system. Since the data is “sync’d” between systems, it is possible to query related data across multiple systems.
Another customer implements this approach to integrate Illustrator files to their Digital Asset Management (DAM) system. The DAM acts as an integration hub feeding data and images to all their other business systems. Any changes made to the images or metadata are fed downstream via the DAM.
By using generic tools configured with industry standards, this approach has provided integration with dozens of DAM, PLM, ERP and other business systems. This approach is not limited to just Adobe Illustrator files as many file types now support custom XMP metadata; even files that don’t support XMP can participate using “sidecar” XML files. Even image files can be related to each other; if the design sketch has the same identifying metadata as a photo, a sample’s photo can be used in a report replacing the previous sketch. The opposite can also happen; a customer service representative can access the design specs for a product by using the metadata contained in the catalog’s photo of the product.
When the data is collected at its origin and passed along with the image, the image becomes the driver for the workflow through the enterprise.
This article originally appeared in WhichPLM: http://www.whichplm.com/editors-choice/generic-tools-the-key-to-simple-integration.html
Automate your publications. Use In-Cat! for Adobe InDesign
PLM Integrations with Adobe Illustrator – Keep It Simple!
Too Tight
In 2005 a client asked us to provide an integration between Adobe Illustrator and a PDM (Product Data Management) system. The integration was to solve two issues:
- Provide an alternate file format that the PDM system's UI could display;
- Reduce the number of keystrokes required to get the Illustrator file imported into the PDM system.
For the last 10 years we have been a strong proponent of helping "creative users" provide their images to the required business systems in the most efficient manner. Most PLM (Product Lifecycle Management) vendors have embraced this as a way to gain system acceptance among the creative user community. What we have seen develop recently is the advent of "tight" integrations – integrations so closely coupled with a particular system that the creative users become burdened with data entry. These integrations cause the exact issues they were meant to resolve.
A recent integration I saw was meant to make the importing of images into a PLM system easier by letting the user remain in Illustrator. The users were not required to open the PLM UI in order to upload images. This was typical years ago when workstations did not have enough RAM to have both applications open at the same time. This caused the user to close Illustrator, open the PLM application and then close PLM in order to return to Illustrator and their next image.
With today’s more powerful hardware this is not the case, in fact many users now have two monitors allowing the user to easily operate both Illustrator and PLM at the same time. Simply staying in Illustrator should not be the objective; reducing keystrokes and steps to make the user more efficient is the objective.
In this particular integration the user was required to do the following in order to integrate their images with the PLM system:
- Open integration plug-in from the menu
- Use a dialog to log on to the PLM system (enter system name, username and password)
- Enter information (metadata) in another dialog (Division; Manager; Style #; Description; Project Parent; Master Product; Template)
- Navigate a tree structure for each artboard to “place” the images
- Navigate a tree structure for where the image database record is located
- Submit
Assumptions Made
How is this easier than having both systems open on the same workstation? The vendor essentially recreated the PLM system’s UI within a plug-in in Illustrator.
This integration also assumes that the multiple images contained in the single Illustrator file exist on individual artboards. This is not always the case; there can be multiple images on the same artboard. The image may be needed with and without the text callouts. Multiple colorways may exist on a single artboard but the PLM system will need each colorway as a separate image. This integration made the user responsible for replicating images across multiple artboards – dictating how the user organizes their Illustrator content.
Adding Unnecessary Steps
By providing such a "tight" integration, there is an assumption that the PLM system is the only destination for these images. They have "locked" the Illustrator output within a single system. The creative users are also responsible for providing images to many other destinations: websites, e-commerce systems, marketing departments, sourcing systems, even labels and retail signage. Each of these destinations will require additional steps/clicks by the user. This integration is counterproductive, as the users will now see it as adding effort, not reducing it.
Also, this type of “tight” integration doesn’t allow for varying roles among different sized companies. In small companies, one creative user may be responsible for all data relevant to the image. In larger companies teams of creative users will have artists responsible for the graphics, and designers responsible for the product incorporating the graphics. The data and workflow for these users is different. Tight integrations are not flexible enough to make everyone’s job easier.
Doesn’t Fit the Workflow
Other “tight” integrations we have seen allow the user to create a complete BOM while in Illustrator. Again, this is replicating the PLM system’s UI for their raw materials database within Illustrator. The keystrokes are almost identical. The advantage of not opening the actual PLM application is offset by the rigidity imposed on the workflow. When a sketch is ready for sharing with PLM users, it is not yet time for a complete BOM. All that is required is a family of fabrics and some suggested colors from the seasonal palette. The exact thread, the specific fabric and the packaging information are not required to get an image in PLM. In the seasonal workflow of a designer, many sketches and images are generated early in the season; only some will reach the point in the workflow where a BOM will be required. Many times it is a design assistant or a tech designer who completes the BOM; the person might not even open Illustrator. Tight integrations make assumptions about the duties of the users and the product’s workflow that may not fit the particular implementation.
It seems that “tight” integrations also make the assumption that every consumer of the image needs access to the original editable version of the image. In reality not all users need or want the large original file at their disposal. For previous architectures this may not have caused problems but as more and more applications become cloud based, providing everyone with the large original image is unnecessary overhead. Some users and systems only require an adequate representation of the image in order to view, comment, review, include on a report and possibly print. If a change is needed, in most cases the original author of the image is asked to make the change.
Limiting the image integration to Illustrator is also a problem. The same users will have images that do not originate in Illustrator. Forcing the user to import those images into Illustrator in order to use the integration is counterproductive. JPGs from cameras, output from Photoshop and Lightroom, and output from other digital applications must all be easily integrated with the business systems.
Keep it Simple, Silly
A better solution is to provide a more flexible “generic” integration; one that focuses on the image or asset rather than the system’s data. Use of standard and custom metadata to drive the integration makes the image the link between users and data. It enables multiple systems to access the same images. It allows representative versions to be “tied” to the original file. It allows the image to support the desired workflow rather than having an integration dictate your business processes.
PLM users get very frustrated with some of the Illustrator integrations out there. The most common reaction I have observed is that the user stalls as long as possible before submitting the image to the PLM system. Designers have used AI files, Excel spreadsheets and e-mail to get their samples made and then, only after sample approval, do they use the Illustrator integration to place the image in the PLM system. They then just backfill the required data and the PLM system misses an entire step in the workflow. That can’t be the intention of the vendor’s integration.
Things have changed in the last 10 years and are changing at a faster rate. It’s time for a new approach on how to use images in the enterprise. One thing is for sure: we need to try to keep it simple.
Apparel, Retail and Consumer Goods: PLM Selection Considerations
PLM vendors for the Apparel, Retail and Consumer Goods market segments can be classified based on their origin. The selection of the right PLM vendor for your company may depend, in part, on matching your company's culture with the vendor's background. The PLM vendors' backgrounds fall into four categories.
1. Engineered Products
Several of the larger PLM vendors have their roots in highly engineered products. They started in aviation, electronics and manufacturing industries. Their legacy applications were typically UNIX-based and provided integration with CAD/CAM systems. Several can be traced back to a common Windchill development environment. These systems were typically focused on configuration control and tracking. (The QA approval for a particular lot of rivets must be linked to the exact part they were used on and the particular aircraft where the part is installed.) Originally the products had long lifecycles but as these vendors moved to electronics, lifecycles became shorter. Dealing with shorter lifecycles positioned these vendors to move into our target market. The arrival of browser-based graphical user interfaces and the technology move from UNIX to PC workstations (Windows and Macintosh) set the entry into the Apparel, Retail and Consumer Goods markets.
2. Extension of a CAD System
Another set of vendors entered the market via pattern making systems. These companies had pattern systems in place at apparel customers and creating an application to generate “tech packs” was a logical extension to the pattern systems. The early versions of these products were PDM systems rather than PLM, but again the technology changes allowed these vendors to move from PDM to PLM as they moved to browser-based systems. These applications tend to be focused on fit, sampling, BOM and costing functionality, as these functions were core to their PDM roots.
3. ERP add on
A third group of vendors were supplying ERP and/or sourcing applications to the Apparel, Retail and Consumer Goods market. As they modernized their systems' databases and implemented browser-based interfaces, it was a logical step to increase their user base by adding PDM/PLM functionality. As you might expect, these systems tend to focus less on design and more on merchandising, logistics and delivery of the products.
4. Started in the Apparel / Retail / Consumer Goods space
The last group of vendors started as Apparel, Retail and Consumer Goods PLM systems; the majority were started by those who worked for, or with, other PLM systems. They saw missing functionality or new technology not yet embraced by the other vendors. They had the advantage of starting from scratch but also the disadvantage of having to build an entire application along with an infrastructure for their support and sales staff. These vendors’ applications are usually based on a development environment (i.e. Microsoft Dynamics), a particular database technology or a “cloud-based” implementation. Since they started from scratch implementing the latest technology is easier than a vendor with legacy code to support.
While the origin of a vendor is not the critical factor in a PLM system selection, it can be useful in determining whether the vendor is a fit for your organization. If you have highly engineered products (outdoor apparel or lingerie, for example), then a vendor with an engineering tilt may be an advantage. If you have a large, diverse user base, an established vendor with an extensive support infrastructure may jump to the front of the line. If your IT department consists of "early adopters", then a start-up may be the best fit.
The philosophy or approach of your company is another consideration in the PLM vendor selection. Some companies want each department to have the best in class tools for their function. This approach means integrating multiple departmental applications into a single “PLM” system.
If your organization is a bottom-up business (departmental decision-makers) then vendors whose applications are modular may get the advantage. Modular systems allow for more flexibility in integrating additional tools with the PLM system. It may go as far as buying components from multiple PLM vendors to create a mixed breed PLM system.
Top-down organizations tend to like to work with fewer vendors. They are likely to pick a PLM system from an existing vendor – their ERP vendor or their pattern system vendor for instance. They will use “adequate” tools in some areas in order to get support from a single vendor.
Your organization’s acceptance of process change is another consideration when selecting a PLM vendor. If your business processes are defined and in place then your PLM system will need to mirror your processes. If one of the reasons to implement PLM is to define new business processes (or re-define existing ones), then your new business processes can mirror the PLM system. In either case the ability of the PLM system to support modifications is key. Some PLM systems are “hard-coded” and require custom programming to alter how they are implemented. Other systems are “configurable” allowing administrators to setup or change how data is represented in the system. Of these configurable systems, there are two approaches. In one, administrator dialogs are used by the system to present configuration options. In the second, XML control files are used to modify how the system works. In the latter case it often becomes more like customization programming than “configuration”.
If your organization is set in its ways, finding a vendor who can configure their PLM system to your processes will be beneficial. If your organization is flexible and willing to change, finding a vendor with an OOTB (Out of the Box) version of its PLM system will make the implementation go much faster.
The last consideration I want to put on the table for discussion involves system and data integration. In every PLM implementation I have been involved with, system and data integration was a requirement at some level. PLM vendors have taken two approaches: one, provide an open database model allowing easy data access for integration; two, provide a proprietary database that requires the use of APIs (Application Programming Interfaces) to access your PLM data. Open database solutions tend to fit better with companies that have internal IT resources to perform their integration. The use of APIs tends to require the PLM vendor (or an approved contractor) to perform the integration. Both methods can be successful; again, it depends on the resources and priorities of your organization.
I hope this article highlights important considerations in selecting a PLM vendor/system that are not on your typical RFQ system-functionality checklists.
Adobe Releases CC2015; Why are you still using CS6?
Licensing: What has changed is the licensing and payment strategy. Adobe now uses a subscription-based model: you "rent" the application by the month (with a discount for paying a year at a time). Applications are available individually or "bundled", so you don't necessarily have to subscribe to the entire suite. Previously you purchased a license for the application, and if you included support, you could get free updates to fix bugs. That model also offered bundles or individual applications. Under the purchased-license strategy you were required to buy upgrades for each major release, typically at least every two years. In my opinion, over a period of several years the cost to use the software is close to the same. The advantage to the consumer is that Adobe's Creative Cloud now includes many useful features that were not available under the old licensing model.
Budgets: When Creative Cloud was initially released, I heard several arguments regarding “double dipping” – “we have already paid for CS6, why should we begin paying a monthly subscription too?” This may have had some validity during the first rollout as Adobe didn’t provide any “credit” for existing licenses (like they do for an upgrade). Now that we are on the third release of CC, you have gotten your money’s worth from your CS6 purchase. The other argument regarded “capital” vs. “expense” budgets; purchases are regarded as capital expenditure and subscriptions come out of the expense budget. I have found that using the subscription model, many companies can actually save money. One example is the use of contractors during peak times; the contractors can be assigned a license on a monthly basis rather than consuming a purchased license.
Software updates: Initially IT departments had a valid, major concern regarding application updates. Creative Cloud originally allowed each user to decide how and when to apply application updates, which could lead to support issues when users are on different versions of the applications. Adobe then released Creative Cloud for Teams (targeted at up to 150 users) and Creative Cloud for Enterprise (over 150 users), which allow an administrator to control the installation and update processes. They also allow for easy transfer of licenses among users. With the Team and Enterprise versions, IT actually has more options and control than under the CS6 versions.
Integration support: Another concern is the support for integrations between the Adobe applications and other business systems and processes. From experience, I am aware of the burden Adobe has placed on integration providers. Prior to Creative Cloud, Adobe shipped major releases on a pre-defined schedule (yearly or longer) with only minor updates in between. Now the release cycle has accelerated and major features can be added at almost any time. This presents a challenge to vendors who provide integration products and services. The vendor must constantly monitor the pre-release programs so they can prepare for the updates. Even with participation in the pre-release, testing must be completed on the final Adobe release prior to providing updates to the integration. With CC2015, the final release was provided with the public announcement, forcing a delay in integration support. If your vendor is not keeping up, alternative integration strategies may be required.
- File storage options
While your files can be stored in Adobe’s Creative Cloud storage, you can still save your files in your current storage locations. You can take advantage of cloud storage when it makes sense while still keeping your important IP files in your secure servers.
- Shared assets
You can create libraries of assets that you can share with others. Share with colleagues or vendors depending on the asset. Logos, colors, fabric swatches, construction details and almost any reusable asset can be organized into libraries.
- Mobility and multiple devices
Use your libraries and cloud storage to make your files and assets available at work, home and while traveling. You can sync your settings so your workspaces are identical whether at home, work or mobile.
- Access to additional applications
Since you can access applications “by the month”, you can occasionally use additional applications as the need arises. With the Creative Suite you purchased a set bundle of applications; Creative Cloud offers multiple “bundled” sets of applications as well as individual applications.
- Licensing flexibility
With the Teams and Enterprise editions, you can re-assign licenses among users as required. You can add licenses for just a single month, or pay for them for a year and move them among departments and divisions as their workloads vary with the season. Freelancers and contractors can easily be accommodated without buying a copy of the application: just rent it.
If you are still using CS6, you owe it to yourself and your organization to explore Adobe’s Creative Cloud 2015.
*Disclosure: I am not an Adobe employee or representative and these comments are not endorsed by Adobe. E-Spec, where I am President and founder, is an Adobe Silver Solution Partner.
Extensis and E-Spec Partner to Integrate Extensis Portfolio Digital Asset Management System with E-Spec Adobe Integration Tools
Integration enables customers to auto-populate Adobe InDesign templates with images and metadata from Extensis Portfolio
Portland, Ore.―July 7, 2015―Extensis®, a global provider of digital asset management (DAM) solutions, announced today it has partnered with E-Spec, digital image workflow specialists who connect Adobe® Creative Cloud® applications to key business systems. Using the Extensis Portfolio™ open API, E-Spec has integrated the latest version of Extensis Portfolio with its suite of Adobe integration tools so customers can auto-populate their Adobe InDesign® templates with images and metadata.
E-Spec’s solutions are used to create and maintain published product offerings, such as look books and catalogs. Customers use E-Spec’s Adobe Integration tools to connect their Adobe InDesign templates to business systems containing relevant content – such as images and product details – so that the information is auto-populated and kept up to date with a single click of a button. This removes manual processes, reducing what once took weeks to days and eliminating costly errors.
By integrating the latest version of Extensis Portfolio into E-Spec’s suite of tools, customers can more effectively manage their digital assets in a centralized location and then easily pull this information into their Adobe InDesign templates.
“E-Spec’s tools are widely used by companies in the fashion, retail and consumer goods industries who need to publish their product offerings with frequency,” said Toby Martin, Vice President of Development and Strategy at Extensis. “These companies have a huge repository of digital content, so linking in Extensis Portfolio offers a powerful solution for both managing this content and creating a direct connection into their creative workflow via E-Spec.”
“Effective digital asset management is critical to any company that has a large collection of digital content,” said Dan Hudson, President and Founder of E-Spec. “This is especially true in the retail industry where we see customers with thousands if not millions of assets. Many are using Portfolio to manage their digital assets, so we’re excited to partner with Extensis to deliver an integrated solution.”
Customers interested in learning more about the Extensis and E-Spec integration solution can contact Extensis or E-Spec directly.
About E-Spec, Inc.
E-Spec is an Adobe® Solutions Partner specializing in integrating the Adobe Creative Cloud® with existing business systems and processes. From the creation of an image, we embed XMP data that flows through your business systems. This creates a digital image workflow through the combination of image files and related data, allowing images to flow to all systems instead of remaining locked inside a single database or location. Our suite of products and services uses Adobe® technology to embed data within the image file itself – across Illustrator, Photoshop and InDesign files as well as JPGs, PNGs and PDFs – so the files become self-aware, allowing the image to drive your business.
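The XMP embedding described above is an open Adobe standard: an RDF/XML packet stored inside the image file itself, so the metadata travels with the file. As a rough illustration of the mechanism only – this assumes nothing about E-Spec’s actual implementation, and real tools such as the Adobe XMP Toolkit or ExifTool handle many edge cases ignored here – the following Python sketch builds a minimal XMP packet and inserts it into a JPEG as an APP1 segment:

```python
# Minimal sketch: embed an XMP packet into a JPEG as an APP1 segment.
# Illustrative only; production tools (Adobe XMP Toolkit, ExifTool)
# handle existing segments, size limits, and other edge cases.

XMP_NS = b"http://ns.adobe.com/xap/1.0/\x00"  # APP1 namespace header for XMP

def build_xmp_packet(title, creator):
    """Build a minimal XMP packet with Dublin Core title and creator."""
    xml = (
        '<?xpacket begin="\ufeff" id="W5M0MpCehiHzreSzNTczkc9d"?>'
        '<x:xmpmeta xmlns:x="adobe:ns:meta/">'
        '<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">'
        '<rdf:Description xmlns:dc="http://purl.org/dc/elements/1.1/">'
        f'<dc:title><rdf:Alt><rdf:li xml:lang="x-default">{title}</rdf:li></rdf:Alt></dc:title>'
        f'<dc:creator><rdf:Seq><rdf:li>{creator}</rdf:li></rdf:Seq></dc:creator>'
        '</rdf:Description></rdf:RDF></x:xmpmeta>'
        '<?xpacket end="w"?>'
    )
    return xml.encode("utf-8")

def embed_xmp(jpeg_bytes, xmp_packet):
    """Insert an APP1/XMP segment immediately after the JPEG SOI marker."""
    assert jpeg_bytes[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    payload = XMP_NS + xmp_packet
    length = len(payload) + 2            # segment length field includes its own 2 bytes
    app1 = b"\xff\xe1" + length.to_bytes(2, "big") + payload
    return jpeg_bytes[:2] + app1 + jpeg_bytes[2:]
```

Because the packet lives inside the file, any downstream system that can parse XMP – a DAM, a PLM, or Adobe’s own applications – can read the same title and creator without consulting a separate database, which is the “self-aware file” idea in a nutshell.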
E-Spec is a Silver Adobe® Solutions Partner. To learn more, visit www.e-spec.net or follow us on Twitter @EspecInc.
About Extensis
Celebrating more than two decades in business, Extensis® is a leading developer of software and services for creative professionals and workgroups. Their solutions streamline workflows, securely manage digital assets and fonts, and control corporate typographic branding. Used by hundreds of Fortune 5000 companies, Extensis’ award-winning server and desktop products include: Portfolio® for digital asset management, Universal Type Server® for server-based font management, and Suitcase Fusion® for single-user font management. Founded in 1993, Extensis is based in Portland, Oregon, and the United Kingdom. To learn more, visit http://www.extensis.com or follow us on Twitter @extensis.
© Celartem, Inc. d.b.a. Extensis. All rights reserved. Extensis and the Extensis logo mark, Suitcase Fusion, Portfolio Server, Portfolio NetPublish, Portfolio Flow and Universal Type Server are trademarks or registered trademarks of Extensis in the United States of America, Canada, the European Union and/or other countries. This list of trademarks is not exhaustive. Other trademarks, registered trademarks, product names, company names, brands and service names mentioned herein are property of Extensis or other respective owners.
Success & PLM
Once upon a PLM
When I first started in the apparel industry, designers didn’t have computers on their desks. Technical Designers typically shared CAD/pattern-system workstations. These workstations were not networked, so file sharing was via floppy disks. There was usually a minicomputer with green-screen terminals for the ERP system. When we were selling our PDM software, we were actually selling the company on the need to have PCs. Designers would carry manila folders to meetings, one for each of their styles. These folders contained CAD printouts, hand-drawn sketches, fabric swatches, ERP printouts (folded computer paper with perforated edges), buttons, trims and maybe a floppy disk.
Our pitch followed the lines of, “wouldn’t it be nice if all of this information was stored in a single location that everyone could have access to and you wouldn’t have to worry who borrowed your folder or where you left it?” We usually had positive responses, but I will always remember one head designer at a large mid-west retailer who said:
“Why would I want to put all of this information into the computer when the style isn’t even in production yet?”
She owned the style until it went into production and she didn’t want to share control of her data. I don’t think I even tried to explain it to her. These were the same people telling me that software would never replace hand sketches.
“A PLM Success Story”
I was reminded of this story recently while visiting a customer who had just completed a “successful” PLM implementation (according to management and IT). I was discussing their image integration points with a Designer, asking at what point she uploaded her Illustrator files to PLM. Her answer seemed very late in the workflow to me: after the samples were approved. I inquired as to how the PLM system showed the image prior to sample approval.
“Oh, it’s too much trouble to put all that data into the system so we wait until we are sure the style will make the line meeting before entering it”.
I asked how the concept samples got produced.
“We send the Illustrator file, an Excel spreadsheet and a pattern reference # to the factory and they make the samples based on that”.
I have never viewed “successful PLM implementations” the same since.
What is a successful PLM implementation?
Typically a company pursuing PLM will have a champion – someone who pushes to get the system purchased. This champion usually represents one or two departments that are having difficulty with the systems currently in place, and one measure of “success” is making those one or two departments happy. This may mean replacing an aging PDM or PLM system, or replacing a manual process of Illustrator and Excel files being e-mailed around. But this is not what most PLM vendors are pitching in their demos – it is not about just solving a particular departmental problem. PLM is positioned as an enterprise solution.
Sometimes an edict will come down from upper management about the need to implement PLM; usually with very little guidance on what the goals are. Objectives like shorter development calendars, increased SKU counts or other “bottom line” metrics are used to justify the purchase of PLM, but reaching these types of goals can be done with a poor PLM implementation (usually at the expense of employees’ time and sanity).
I have been asked to help maintain older PDM systems years after they were “replaced” by successful PLM implementations. In most cases, it turns out that the company didn’t force all departments or brands onto the new system, so the older system lingered for years – circumventing the PLM advantages along the way. A deprecated system usually must be kept until the current production styles work their way through their lifecycle, so an older system can be expected to survive nine months or so after the new PLM system is implemented – but five years is excessive. The reasons given are:
- The new system is too complicated for our usage; or
- It doesn’t do “this function” the way we have to have it done.
In reality, management didn’t achieve buy-in from all parties or didn’t remove people who weren’t on-board.
Some “successful” implementations have been halted mid-rollout. They were declared a success even though not all divisions and departments were using the system, and some users were allowed to continue with, or revert to, Excel spreadsheets and e-mail. The investment was too large to admit failure, so management and investors all believe that PLM is adding to the bottom line when, in truth, they wasted their money.
In another case, a division (previously a company that was acquired) is allowed to continue using its older PDM system rather than the new corporate PLM system. Not because the division didn’t see the value in the new PLM system, but because its old system was already paid for while the new one carried constant additional costs: corporate would “bill” the division not only for the new PLM licenses and maintenance but also add IT overhead to the monthly bill – even though the IT staff was off-site and its effort would be unchanged. While the PLM system might have saved more than the charges, it was control and politics that ruled the decision.
Product Lifecycle – you already have one
The issue is the very idea of success – PLM is not an application that is installed, turned on and maintained. It isn’t even necessarily a single application. Everyone already has a product lifecycle management system; it may be completely manual or even paper-based. If you produce products, you are managing them through a lifecycle (even if you aren’t aware of it). Automating that management and optimizing the workflows while instituting metrics (for continual process improvement) should be the measure of success. It is an unending road, and one that may lead through multiple PLM vendors along the way.
Which PLM?
Many times I have been asked which PLM software I think a particular company should purchase. My answer is always:
“It doesn’t really matter. I can make any of them work, but only if management has a commitment to change.”
It is not the software, it is about how you use the system as a team. If departmental barriers (budgets and bonuses) remain in the way, no software will make department heads cooperate. If “blame games” remain the status quo, how can computers help?
Ongoing and Evolving
If you view your PLM implementation as “complete”, then you don’t have a successful implementation. If you are not taking advantage of Adobe Creative Cloud or 3D modeling, your system is falling behind. PLM should be viewed as an ongoing process-improvement initiative – one that utilizes the latest technology while continually adjusting to fast-changing markets. If your PLM budget has an end date, you might need to rethink your approach.
This article originally appeared: http://www.whichplm.com/editors-choice/success-plm.html