Complex Hierarchies in 7.1

June 17, 2009

The new multiple main tables feature in MDM 7.1 opens up lots of opportunities, for example modeling complex hierarchies. If your data model requires a hierarchy between different types of objects (say, Markets –> Manufacturers –> Products), just use multiple main tables with a Lookup [Main] field connecting each level to its parent (and children, if needed). Using main tables also means you can model high-cardinality hierarchies.
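
To make the idea a bit more concrete, here is a rough sketch of such a model (the table and field names are made up for illustration):

Markets        (main table)
Manufacturers  (main table), with a field “Market”: Lookup [Main] pointing to Markets
Products       (main table), with a field “Manufacturer”: Lookup [Main] pointing to Manufacturers

Each Product points to its parent Manufacturer and each Manufacturer points to its parent Market, which gives you the Markets –> Manufacturers –> Products hierarchy.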

Don’t forget to consider sizing issues and the fact that you’ll need some kind of custom UI to show the data in a tree-like structure (my 2 cents — I think a tree-like UI makes sense for a relatively small hierarchy, whereas an iTunes-like drill-down UI makes much more sense for larger volumes) — but these problems are solvable with a reasonable amount of effort.

I just started two projects that use 7.1, so I hope I’ll be writing regularly about new things you can (and possibly, can’t) do with this new, very promising, version.


Null=Null == Null≠Null

June 11, 2009

This is one of those things you never think about. I got a simple requirement from a customer: records are checked out when a workflow starts, users edit the record, and at the end of the workflow MDM should check whether some fields were changed since the record was checked out, and do something if they were. I needed a validation that returns True when certain fields in the record were changed, so I figured the following, very simple, validation would be enough:

[Original].Field = Field

I quickly learned it works, but only when Nulls are not involved. MDM doesn’t consider Null as blank, and the comparison doesn’t work as expected. This is the expression I came up with that works no matter what:

IF(IS_NULL([Original].Field),
   IS_NOT_NULL(Field),
   IF(IS_NULL(Field),
      TRUE,
      [Original].Field<>Field))

What it does is the following:

  • If [Original].Field is NULL, return the result of IS_NOT_NULL(Field)
  • If not: if Field is NULL, return TRUE
  • If not: return [Original].Field<>Field

I get why MDM doesn’t consider Null as the same as nothing, but at the very least I think it would be nice to have a COMPARE function in the Expressions Editor that can compare two fields and consider Nulls as blank values.
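
Until something like that exists, the same pattern can be chained for several fields. Here is a sketch, assuming the expression editor’s OR operator and two made-up fields named FieldA and FieldB:

IF(IS_NULL([Original].FieldA),
   IS_NOT_NULL(FieldA),
   IF(IS_NULL(FieldA), TRUE, [Original].FieldA<>FieldA))
OR
IF(IS_NULL([Original].FieldB),
   IS_NOT_NULL(FieldB),
   IF(IS_NULL(FieldB), TRUE, [Original].FieldB<>FieldB))

It gets verbose quickly, which is exactly why a built-in COMPARE would be welcome.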

Alon

Think UNIX — secure your SAP MDM!

March 2, 2009

I’m a UNIX person myself. Ever since my first IT employer told me I’d be using Linux as my development desktop I was hooked. Well, to be honest, at first I was depressed, since it took me too long to install it (back in those days it was much more difficult than it is now), but once the system was up I was hooked. Having a powerful command-line shell and the flexibility to tweak practically every part of the system felt great. I also liked the way the system kept itself safe. Unlike Windows, where the default user could launch a nuclear missile attack by accidentally clicking “OK” on a pop-up, in UNIX/Linux the default user could only use the system, not administer it. To do anything that could harm the system you had to either have the right to do that task, or know how (and have the right) to perform tasks as the “root” user, the UNIX equivalent of an almighty, godly presence.

Unfortunately, MDM takes the Windows path. When you create a repository, the Default role has all the security functions enabled, and the only existing user, Admin, can do everything. More often than not project teams simply either use the Admin user or create several users with the Default role, allowing everyone to do everything. In many cases this “tiny” bit of configuration is overlooked when moving from Development to QA and then to Production, and I don’t need to describe the different levels on which this is wrong.

However, even before moving away from Development, some things should be changed. Roles can be used to make sure the project team doesn’t accidentally overwrite things like that Import Map you spent a week configuring by clicking “Save” instead of “Save As”, or those 2000 test records you spent some time perfecting by accidentally hitting Delete with all the records selected and ignoring the “Are you sure…” pop-up. There are a lot of little things to think about when designing security for an MDM project, and it’s probably not possible to create a one-size-fits-all list, but here are a few pointers to help you get started:

During initial repository setup:

  • Change the Default role to be a read-only role.
  • Create a user called “Read Only” and give it read-only rights. Use this user whenever someone asks to see what you’ve done so far. This way you can rest assured that reviewers don’t mess things up.
  • Create roles named “Import Map Design” and “Syndication Map Design”. Give these roles the rights to create import and syndication maps, but not to overwrite existing ones. Less experienced team members should use these roles when working with Import Manager so they can’t accidentally overwrite existing maps.
  • Create a role named “Syndication”. This role should not have any function related to Check In/Check Out. This makes sure you always syndicate the checked-in version of records. Use this role whenever you run a syndication and also configure Syndication Server to use it.

When moving from Development to QA/Production:

  • Most users should not have any Taxonomy privileges. There’s no point in letting anyone except the people who manage the taxonomy create attributes, re-link them, or change them in any way.
  • Most users should not be allowed to perform mass deletions or updates, unless their position specifically requires that.
  • Make sure the Admin user has a password. (You’d be amazed how often this one is overlooked). Or better yet, change the name of the Admin user to something less common.
  • Make sure that only a select few users have access to the MDM Console application, and make sure, at the network level, that the firewall only allows their stations to communicate with the MDM server on the Console ports (20000 – 20005); see the sketch after this list.
  • You built a Master Data Management system — make sure users are assigned to manage the different tables. There really is no point in giving everyone the right to manage all the tables. Most users should only have read-write privileges on the Main table, while others should only be allowed to administer the lookup tables. Sit down with the data owners and assign different parts of the repository to different roles and users, to avoid getting your data dirtied by users who think it’s best to “just add that value to the list for now until someone gets around to checking if it’s the right one”.
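
For the firewall point above, here is a minimal sketch of the idea, assuming a Linux packet filter in front of the MDM server and an administrators’ subnet of 10.1.2.0/24 (both the tool choice and the subnet are assumptions; adapt them to your own landscape):

# allow only the administrators' subnet to reach the MDM Console ports
iptables -A INPUT -p tcp --dport 20000:20005 -s 10.1.2.0/24 -j ACCEPT
# drop Console-port traffic from everywhere else
iptables -A INPUT -p tcp --dport 20000:20005 -j DROP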

These are only guidelines. Extend them to support your project’s security requirements.

“Fax, Don’t Answer”

February 19, 2009

It was 12:30 in the afternoon, lunch rush hour in the business district of Herzliya, dubbed by many “Israel’s Silicon Valley”. I had walked into this sushi restaurant about 15 minutes earlier and placed a take-away order. Since the place was crowded with all kinds of IT employees decadently having sushi for lunch, I sat on a bar stool just beside the service area, where everything was happening — the waiters and waitresses were bustling in and out of the kitchen, the bartenders were pouring drinks, and the shift manager was overseeing everyone and generally looking worried. As if the eat-in crowd wasn’t enough, there were many orders coming in by phone for people to pick up. I noticed there were two phones on the wall. They were completely identical; the only difference was that one of them had a sticker on it. The sticker said: “Fax, Don’t Answer”. I even took a picture of it:

Fax, Don't Answer

Can you see that the right-hand phone has a small white sticker on it? I noticed it because every time a phone rang (either of the phones, since they sound completely alike) people would reach for one of them, then wait for the next ring to figure out which phone it was coming from, and then stop to think whether they should or should not answer, because there really is no point in picking up the fax phone. I observed this for another 30 minutes or so until my order was ready. For some reason faxes were coming in (or people were accidentally dialing the fax number) pretty often, which kept throwing people out of “the zone” and slowing everything down.

If the fax number is only used for faxes, why does that phone even ring? Why not just mute it? And why is it hung on the wall right next to the regular phone? This is a good real-world example of why making data accessible and coherent matters. Even the veteran employees there err sometimes and pick up the fax phone, so what happens to the new ones? They must get totally confused by this.

Think of the phones as organizational master data. Should everyone be exposed to the fax phone? Wouldn’t it make more sense to just let the few people who actually get faxes have access to it? Why confuse people by placing two identical phones when one of them should never be answered? I often say that master data management is more than consolidation and data quality improvement — it’s also about making data more accessible and coherent. SAP MDM provides many tools to make data clearer to your users. Don’t just replicate the existing complex data structure; improve it, clean it, make it simpler to use and to understand.

  • Translate field and table names if your users speak different languages.
  • Don’t use codes — use Lookup tables or remote-keys to let the user select values from a list of real words instead of codes, and save the codes in the background.
  • Use MDM’s rich set of field types — why have 2 fields called “Length” and “Length UOM” instead of a single Measurement field called “Length”?
  • Use the Portal iViews to make the data stored in MDM accessible (even as read-only) for many users via a web-based interface.

The list goes on. The bottom line is — don’t just consolidate, improve. Make the data more accessible, empower users, make it easier for newcomers to dive right in. MDM has all the tools to do it; use them.

The Times They Are A-Changin’

January 21, 2009

When I teach SAP MDM courses I often get asked if MDM can handle time-dependencies, with the most common request being to have values change according to the current date. The short answer is no — while MDM has no problem storing date or time stamps, it can’t act on them. The long answer is that anything is possible, with some creativity. The solution I provided to one of my clients doesn’t use only MDM features, but whoever said we must confine ourselves to what MDM can do on its own?

One of my customers wanted the value of a certain field in the main table to change when a specific date is reached, with each record changing on a different date. What I did was add another field to the main table, called “Change Date”, to hold the date on which the value of the other field (let’s say its name was “Time Dependent”) needs to change, and I used automated imports and the operating system’s task scheduler with a short script to update the records that needed to be changed each day. Here’s how it works:

1. Add a field to store the date

Add a field to store the date you want to act on. In this example I’ll call it “Change Date”, but obviously you could also call it “Johnny Bravo” or “Shala Lala”.

2. Create an import map and inbound port

Once I had the fields in place, I created a dummy source file to build an import map on. I created a CSV file (we’ll see why later) that looked something like this:

date,new value
6/6/2009,X

Using Import Manager I created a very simple import map based on this source file: I mapped the “date” source column to the “Change Date” field and “new value” to “Time Dependent”. In the Field Matching tab I told MDM to match on the Change Date field, and set the default import action to “Update All Mapped Fields”.

This setup meant that MDM would update “Time Dependent” with the value from the “new value” column in the source file, but only in the records whose “Change Date” matches the date specified in the source file. Once the map was created, I defined an inbound port using it.

3. Schedule a nightly update

The next step was making sure records would be updated on the right date, automatically. To do that I used the operating system’s task scheduler; every operating system has one (on Windows it’s the Task Scheduler, on UNIX it’s cron). That specific project used Windows, but the same thing can be done just as easily on a UNIX-based system (there’s a sketch of that at the end of this post).

I created a scheduled task that runs a small, simple batch file. The batch file creates a CSV file to be imported (using the map created earlier) and places it in the inbound port’s Ready directory. It looked something like this:

REM build today's update file in the inbound port's Ready directory
cd /location/of/inbound/port/Ready
REM the header row must match the dummy file the import map was built on
echo date,new value > file.csv
REM %DATE% expands to the current date (the format is locale-dependent)
echo %DATE%,X >> file.csv

If you are so fortunate as to have never needed to learn how to speak Windows Batch, I’ll translate: the script creates a CSV file that looks like the dummy file I used to build the import map, but with the current date as the value of the “date” column. It creates the file in the Ready directory of the inbound port, so it will be picked up by Import Server.

I scheduled the task to run every day at midnight, and that’s it — every day, all the records that are set to have one of their values changed that day are updated by the import. You can extend this method to support multiple fields, and even multiple values; you just need a bit of creativity.
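
For completeness, here is a rough equivalent for a UNIX-based system: a small shell script plus a crontab entry. The paths are placeholders, and the date format has to match whatever the import map was built with:

#!/bin/sh
# build today's update file in the inbound port's Ready directory
cd /location/of/inbound/port/Ready || exit 1
echo "date,new value" > file.csv
# the date format must match the dummy file the import map was built on
echo "$(date +%m/%d/%Y),X" >> file.csv

And a crontab entry to run it every night at midnight:

0 0 * * * /path/to/update_records.sh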

Touching records with Import Manager

December 23, 2008

I tried to figure out what would make a good first post for this blog. Should I introduce myself and tell you what I do? Should I write about something basic yet often misunderstood, like qualified lookup tables? I figured the people I’ll be directing to the blog’s first post already know me in one way or another, and that they already know what qualified lookup tables are. So I decided that for the first post I’m going to share an MDM tip, something I’ve been using for some time that I think others could benefit from greatly.

Import Manager (and Import Server) can be used to do more than just create new records or update existing ones with new values from some data source. I had a requirement in one of my projects to let users send a list of product IDs and automatically trigger syndication of the MDM records that matched those product IDs. The users were already using Excel files in the old process, so I figured the easiest thing for them would be to keep working with what they know. I also wanted to use standard features as much as possible and avoid custom coding, to make things simpler, quicker and ultimately cheaper.

The solution was quite simple. The first step was to create a workflow that simply triggers a syndication of all the records that were added to it. It has only Start, Syndicate and Stop steps, and it’s configured to launch automatically and to be triggered by an import.
The second step was to create an inbound port that accepts the Excel sheets users create, with an import map that only matches product IDs from the Excel sheet to records in the repository, sets the Default Import Action to “Update NULL Fields Only” so that no new records are created, and is configured to launch the workflow.

This setup means that when a user drops an Excel sheet with a list of product IDs in the inbound port’s Ready directory, Import Server triggers the import map, which in turn “touches” all the records in the repository that match the product IDs in the Excel sheet and sends them to the workflow. I say “touch” because no real import takes place and no value is changed in the records; they’re just touched and marked by Import Manager to be sent to the workflow, which is kind of similar to what the UNIX touch command does. Since the workflow is fully automated, the next step is syndication of these records, which fully answers the requirement.

Since then I’ve used this trick for other things, the latest one was to easily implement time-dependent fields in MDM, which is what I will write about in the next post.

I hope you’ll find my posts useful. Feel free to ask questions and leave comments; I promise to do my best to answer.

And most importantly, have a merry Christmas and a wonderful 2009!