Brent Martin


Got Oracle? Better Manage Your Licenses!

Sat, 2014-08-02 15:55

There's actually quite a bit of information on the internet about the complexity in managing the technology licenses you purchase (and don't purchase) from Oracle.  This has been a problem for many versions, and is still a problem in the current version.  Here are some examples:


http://kevinclosson.wordpress.com/2014/07/24/oracle-database-12c-release-12-1-0-2-my-first-observations-licensed-features-usage-concerns-part-i/
http://kevinclosson.wordpress.com/2014/07/25/oracle-database-12c-release-12-1-0-2-my-first-observations-licensed-features-usage-concerns-part-ii/
http://mikesmithers.wordpress.com/2013/01/26/oracle-database-diagnostic-and-tuning-packs-exactly-what-are-you-not-licensed-for/
http://mikesmithers.wordpress.com/2013/02/09/sqldeveloper-and-a-very-expensive-query/


In these articles, a few common patterns emerge:


Installing Oracle software enables extra-cost "licensable" features that you may or may not be entitled to.
Some of these features can be disabled. Others cannot be disabled by any method in the Oracle documentation.
Regardless, Oracle databases track "usage" of all of these components in audit tables in the database.
If Oracle audits your company for license compliance, the data in those audit tables can be used as evidence against you and may leave your company on the hook for licenses it never intended to use (see the sample query after this list).
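
If you want to see what your own databases have already recorded, one place to look is the DBA_FEATURE_USAGE_STATISTICS view. This is only a sketch -- the column names are as I recall them for recent Oracle releases, so verify against your version before relying on it:

-- Hedged sketch: what has this database recorded about feature usage?
SELECT name,
       version,
       detected_usages,
       currently_used,
       first_usage_date,
       last_usage_date
  FROM dba_feature_usage_statistics
 WHERE detected_usages > 0
 ORDER BY last_usage_date DESC;

Extra-cost options such as the Diagnostics and Tuning packs show up in this view even when the usage was accidental, which is exactly the scenario the posts above describe.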


I don't really want to debate Oracle's intentions with these practices, or whether or not they'd actually compromise their relationship with their clients to pursue accidental-use revenue.  Oracle is a sophisticated organization and I think it's safe to assume this behavior is deliberate and is designed for their own benefit.  At best, these practices represent a risk for Oracle's customers that needs to be mitigated.

So if you are going to purchase and deploy Oracle software - and you almost certainly will given Oracle's acquisition record -  you need to take steps to protect your company from potential license compliance issues down the road.

To be fair, the items I'm outlining here make sense when you license and deploy enterprise software from any vendor.  But I'm picking on Oracle because I believe managing the deployment of Oracle licenses is possibly the most complex challenge of its kind in the industry, and it deserves special attention.

Before we go any further I need to put out the standard disclaimer.  I am neither a lawyer nor an Oracle licensing expert, and I am not an authority on these topics. Do not take anything in this article at face value.  Validate everything with your own experts.  Hire your own consultants and attorneys who specialize in Oracle products and Oracle negotiation to help you through the process.

Now that that's out of the way let's get started.  The areas I think you should focus on are:


Negotiating the initial contract
Implementing an IT license tracking solution
Creating a compliance process that involves your strategic sourcing team and the technology folks who work with the products


Reducing Compliance Risk Starts With The Contract.
Once you select a set of products that you want to purchase from Oracle, the negotiations start.  Oracle typically extends relatively large discounts off the software list price (I've seen 50%-95% depending on the size of the deal).  The discount is important, but here are some other things besides price that are equally important to negotiate:



Oracle changes the terms of their licensing from time to time. This will not be to your benefit. Be sure to negotiate contract terms that lock in specific license metric definitions and license terms.  And be sure to include language that protects you from the "click-through" agreements.
Along the same lines, be sure to create a Master Services Agreement that will cover future purchases so that you don't have to re-negotiate these terms with each incremental purchase.
Don't allow Oracle to audit your company for license compliance.  In my opinion they shouldn't have the right to show up and audit you unless they have evidence that you are out of compliance.  Be sure to negotiate these terms carefully because it could save you from a ton of trouble later.
Do include language stating that if licensable components are installed without specific notification -- in the installation program or in the software itself -- then your company is not liable for any usage.
Do not agree to use Oracle's scripts that will crawl your servers and detect usage.  Agree to provide this information from your own tracking system.

Deploy a License Tracking System
OK so hopefully you'll start out with a contract that protects you from things like accidental deployments and random audit demands.  The next layer of protection involves a system that can keep track of all of this for you…

KScope 13 - Wednesday

Thu, 2013-06-27 05:14

I just made it back from the Mardi Gras celebration, and my decision to blog at this point may not be my smartest decision ever, but I feel like it's now or never, so here goes.


Tom's Top Twelve Things About the Latest Generation of Database Technology


Tom Kyte


Started the morning with Tom Kyte and his session on Oracle 12c.  Say what you want about Oracle, but they keep innovating and improving their database platform.  There are a lot of features we've been waiting on for a long time (like defaulting a column to a sequence, default if null, etc.), and being able to embed PL/SQL inline in a SQL statement is way cool.  But I was blown away by the new Row Pattern Matching functionality, and the Pluggable Database options will be very valuable to cloud providers and other companies who like to consolidate databases onto shared hardware and offer database as a service to the company.  Download the deck and get familiar with the new features.
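
To make a couple of those concrete, here's a rough sketch of the sequence-default / DEFAULT ON NULL syntax and of PL/SQL embedded inline in a query. The syntax is from my notes on 12c and the table and names are made up, so treat it as illustrative rather than production-ready:

-- 12c: default a column from a sequence, and supply a default when NULL is inserted.
CREATE SEQUENCE order_seq;

CREATE TABLE orders
( order_id  NUMBER        DEFAULT order_seq.NEXTVAL PRIMARY KEY
, status    VARCHAR2(10)  DEFAULT ON NULL 'NEW'
, amount    NUMBER(12,2)
);

-- 12c: PL/SQL declared inline in the SQL statement itself.
WITH
  FUNCTION with_tax (p_amount NUMBER) RETURN NUMBER IS
  BEGIN
    RETURN ROUND(p_amount * 1.07, 2);
  END;
SELECT order_id, with_tax(amount) AS gross_amount
FROM   orders
/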


OBIEE and Essbase Integration with BI Foundation Suite 11.1.1.7


Mark Rittman


Who knew Mark Rittman was from the UK?!  Anyway he had a great presentation about how to integrate Essbase with OBI, what kinds of problems you might run into, and what works well as well as what doesn’t.  Most of this stuff has been blogged about on the Rittman Mead blog so if you missed the session don’t worry.


What I heard was that BI Foundation 11.1.1.7 was just released back in April, but it has a lot of features that may make it worth upgrading to.  Smartview can now report from the OBI logical model.  A lot of the Essbase integration wasn't all that new (we've been able to plug Essbase into OBI's semantic layer for some time), but with 11.1.1.7 Oracle has aligned the security managers. This allowed Oracle to get past some technical challenges around the integration, and now the Essbase workspace can run inside of OBI.  It's cool that Financial Reports can also run within this workspace, so we're getting closer to an integrated reporting view.


There’s also a developer preview where you can spin an Essbase cube off of OBI content.  It doesn’t have a lot of options, but it works and automates most of the steps that previously would have been manual.  Good stuff.


Beyond the Import Metadata Wizard


John McGale


John continued with the Essbase/OBI integration theme, but he was coming from the OBI side and had some great slides about how the Essbase and OBI disciplines are different.  I'm not an expert in either one, but what I heard was that the data warehouse (OBI) discipline is all about star schemas and the Kimball methodology, while Essbase is all about how accountants think about their financial data, with data stored in hierarchies at the intersections of dimensions.  I might have heard him say that OBI can outperform Essbase.  Unfortunately I had to duck out for a conference call (and to avoid a fistfight that had broken out), but I do plan to download the slides just to see what I may have missed.


Streamline your Financial Close with an Integrated Tax Provision


Janette Kosior


New Orleans can be a crazy place, and after a night on Bourbon Street I thought I had seen it all.  Then I went to the HFM Integrated Tax Provision session.  Wow.


Ok, I shouldn’t give the tax accountants a hard time.  Heck, I doubt if any are actually at this conference since most of the audience (the ones wearing clothes anyway) were Hyperion admins.  But something about the topic had everybody in a party mood.


What happens in the Integrated Tax Provision session stays in the Integrated Tax Provision session, but I’m going to bend the code a bit to tell you something about this new application from Oracle.  First, it’s all built on top of HFM.  And why not? HFM is the perfect platform to leverage the dimensionality, complex business rules and robust reporting needed to deliver nice tax provision functionality.


So it’s deployed as a separate HFM-on-steroids application.  It has to be kept in sync with HFM which can be done through EPMA, which is more than you can say for CorpTax or your spreadsheets.   This is an initial release, but the speaker seemed confident that all of the bells and whistles needed were there and the system is rock solid.  There’s even a Financial Management Analytics Module that will let you do some awesome tax reports and dashboards.


On the down side, you have to buy a new license and pay maintenance on it (buzz kill), and the analytics module would be yet another license and maintenance (major buzz kill).


This is part of Oracle’s close manager, which automates your end-to-end close process and helps you build a world-class close process.  And that's enough to make accountants want to  P A R T Y!!!


 


KScope 13 - Monday

Tue, 2013-06-25 05:39

This week I'm in the Big Easy hanging out at ODTUG’s Kscope 13 conference. I have heard this is the best conference for Hyperion and Oracle Business Intelligence so my expectations are high. I’m not as strong in either topic as I am with PeopleSoft but there’s nothing like a good conference to get up to speed.


So here's a quick rundown of the sessions I was able to attend today, along with a synopsis of each:


Balance Sheet Reporting in HFM 6/24/13


Alexandre Seran


Balance sheet analysis is often done outside of HFM in Excel.  CTA (cumulative translation adjustment) is often handled the same way.  This is a problem because of tight timelines and accuracy concerns.


OK You can’t automate everything. There will always be entries that have to be manually made in the financial statements. But this doesn’t necessarily mean Excel. Alexandre presented a methodology to capture the manual adjustments (and supporting documentation) in Webforms and/or Smartview and create proof of CTA, cash flow statements for each entity, and a detailed balance sheet movement report all within the boundaries of HFM.


This session went pretty deep into the business requirements of balance sheet analysis and reporting, which was mostly over my head (I admit), but I left with an understanding that a good HFM architecture and design lays the foundation for streamlined, automated month-end reporting, and that without that foundation you may find yourself with a highly manual reporting process.


Using OBIEE to Retrieve Essbase Data


Edward Roske and Tracy McMullen


Oracle has done a lot of work to integrate Essbase into everything.  OBI is no exception.  But integrating a relational tool like OBI with an OLAP tool like Essbase requires some planning and effort.  This session did a great job laying out what’s required to make this integration happen and boiling it down into 7 steps.  Ok the 7 steps shouldn’t be taken TOO literally but definitely download this presentation or check out my favorite Hyperion Planning authors’ blog(s) if this is something you’re interested in.


Introduction to the Essbase Java API


Tim Tow


Tim is an admitted Essbase geek and self-taught Java programmer.  Not too shabby for someone with an accounting background.  This session focused on the Essbase Java API and what you can do with it.  It included specific programming examples and use cases.  While I personally think he spent way too much time on the ins and outs of Java programming (what is a class, what makes a good IDE, etc.), the session provided a detailed view of what you can do with the Java API.  It's good to know we're not limited to .NET and Visual Basic for working programmatically with Essbase anymore.


Automating Hyperion Reporting for the Rest of Your Organization


Jim Wilking


This session turned out to be about how to use the delivered Smartview VBA functions to generate advanced Smartview reports.  Topics included how to add prompts to the report, how to refresh specific sections of the report based on those prompts, etc.  And if you have a lot of reports to run, Jim showed how that could be automated as well.


VBA in Excel always demos well (especially to accountants!) and there's nothing like creating custom automation for a specific task to improve efficiencies.  But as far as I could tell, the functionality in this presentation could also be achieved with Hyperion Financial Reporting.  Jim did point out that there are disadvantages to a heavy dependence on VBA for generating your monthly reporting package.  Having to maintain a set of reference values in Excel and having to keep advanced VBA skills on your team were a couple he called out.  From experience I know what a headache it can be to upgrade when you have a lot of custom VBA code to re-test, so I would add that to the list.


Summary


All in all it was a good day and I definitely left with a lot to think about.  Looking forward to day 2 tomorrow!


 

Don't Pave the Cow Path

Thu, 2013-04-18 22:09

Lately at work I've had to remind myself not to pave the cow path -- not to bring existing business processes and technology forward just because that's how we've always done it.  Here's the poem that the expression comes from.


The Calf-Path

by  Sam Walter Foss  (1858-1911)

One day, through the primeval wood,
A calf walked home, as good calves should;
But made a trail all bent askew,
A crooked trail, as all calves do.


Since then three hundred years have fled,
And, I infer, the calf is dead.
But still he left behind his trail,
And thereby hangs my moral tale.


The trail was taken up next day
By a lone dog that passed that way;
And then a wise bellwether sheep
Pursued the trail o’er vale and steep,
And drew the flock behind him, too,
As good bellwethers always do.


And from that day, o’er hill and glade,
Through those old woods a path was made,
And many men wound in and out,
And dodged and turned and bent about,
And uttered words of righteous wrath
Because ’twas such a crooked path;
But still they followed — do not laugh —
The first migrations of that calf,
And through this winding wood-way stalked
Because he wobbled when he walked.


This forest path became a lane,
That bent, and turned, and turned again.
This crooked lane became a road,
Where many a poor horse with his load
Toiled on beneath the burning sun,
And traveled some three miles in one.
And thus a century and a half
They trod the footsteps of that calf.


The years passed on in swiftness fleet.
The road became a village street,
And this, before men were aware,
A city’s crowded thoroughfare,
And soon the central street was this
Of a renowned metropolis;
And men two centuries and a half
Trod in the footsteps of that calf.


Each day a hundred thousand rout
Followed that zigzag calf about,
And o’er his crooked journey went
The traffic of a continent.
A hundred thousand men were led
By one calf near three centuries dead.
They follow still his crooked way,
And lose one hundred years a day,
For thus such reverence is lent
To well-established precedent.


A moral lesson this might teach
Were I ordained and called to preach;
For men are prone to go it blind
Along the calf-paths of the mind,
And work away from sun to sun
To do what other men have done.
They follow in the beaten track,
And out and in, and forth and back,
And still their devious course pursue,
To keep the path that others do.


They keep the path a sacred groove,
Along which all their lives they move;
But how the wise old wood-gods laugh,
Who saw the first primeval calf!
Ah, many things this tale might teach —
But I am not ordained to preach.


 

PeopleSoft 9.2 to be released March 22

Mon, 2013-03-18 15:33

Oracle announced today at the Alliance '13 conference that PeopleSoft 9.2 will be generally available on March 22.


Here's the link to the press release:  http://www.oracle.com/us/corporate/press/1920557

Changing your PS Database Platform: Cutover

Sun, 2013-02-17 00:33

So far I've written about how you might approach the plan, design, build and test phases of a PeopleSoft replatforming project.  This time around I'd like to spend some time on the Cutover.


You'll probably want to do at least four mock cutovers: one to build the initial development environment on the new hardware, one to start System Test, one to start User Acceptance Testing, and a "dress rehearsal" to prove out your cutover plan and strategy.


Start the cutover plan when you do your first migration. Capture tasks and timing. And continue to refine it with each additional mock cutover.


For the 3rd mock cutover, include items in the cutover plan for communication, external systems that will need to be modified and moved in parallel, shutdown sequence for batch, expected timeline, contact lists, etc.  By now your communication plan should be fairly explicit and there should be no surprises from the extended IT team or the business as to what will happen and when.


One to two weeks prior to cutover, execute a “dress rehearsal” where you actually move your production database in as realistic of a fashion as possible.  Validate your final timings and make sure nothing was missed.


Two words about cutover communications:  They're important.  You need to keep all of your stakeholders informed of where you are in the cutover, raise any issues quickly, and ensure all of the handoffs are executed cleanly with no loss of time.  Identify a single point of contact (or contacts if you'll be running cutover around the clock) who can get status from the team members without bugging them too much and prepare regular communications for the interested stakeholders.


 In addition, you’ll probably want to maintain two open conference call bridge lines:  One for executive/stakeholder updates, and another to allow your technical teams to quickly collaborate on handoffs or issues that arise.


 A good cutover plan will include a final “Go/No-Go” decision point prior to starting any cutover activities.  If you have no “Severity 1” or “Showstopper” issues the cutover should proceed on schedule.


 Now the cutover plan becomes the script for everything over the next hours and days.  A common scenario follows:  Users close out transactions.  Batch schedule is stopped in a controlled manner. Final interface files are sent.  Validation reports are run that users will use to validate the system when it comes back up.  Finally user accounts are disabled, the application is stopped, and the DBA team (who is hopefully caught up on sleep) takes over.


 Now the DBA team executes the data migration using whatever tool you decided on.  Row count reports and other validation will be executed when it’s complete and the PeopleTools upgrade will start on the database.  This can be the longest part of the process.  Then all of your customizations are migrated in, the application is configured and a non-destructive technical checkout is conducted.


It's typical at this point to allow a limited number of users to log in and enter and process real production transactions. This allows any problems to be identified and resolved before the system is turned over to the larger user population.


 Finally we’re ready to go.  Get your project sponsors and executives on the phone for a final Go/No-Go decision.   Once you get the green light, unlock all of the users and start your batch schedule back up in a controlled manner.  Congratulations!  This is a big accomplishment!!

Changing your PS Database Platform: The Test Phase

Mon, 2013-02-11 02:51

So far I've written about how you might approach the plan, design, and build phases of a PeopleSoft replatforming project.  This time around I'd like to spend some time on the Test phase.


Just because you're only changing some SQL around doesn't mean that you can take shortcuts with testing.  You'll want to run an entire stem-to-stern test of your PeopleSoft system to ensure that everything works as expected.


 One thing to keep in mind:  99% of your defects will be with custom and customized code.   The delivered code won’t generate nearly as many problems, so if you’re deciding where to spend your testing resources definitely focus on the custom processes.


As I mentioned in the Build phase, Unit Testing is critical.  It's arguably the most important testing that you will do, so come up with a mechanism to track unit test results against the objects modified, and make sure you have 100% custom code coverage.


Testing the data migration process is important too.  Databases differ in subtle ways and you'll want to make sure your normal and extended character sets make it across correctly.  Naturally you'll want to run row count reports, but you'll need to go beyond that.  You'll want to make sure the data in the underlying tables is identical and nothing has been lost in translation.  One simple way is to use ODBC to join to tables in the source and target databases and create queries that ensure the individual columns are identical.  Unfortunately that approach is extremely slow.  Another approach is to run hash algorithms on key character fields and summarize the totals, comparing the results on the source and target databases.
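
Here's a rough sketch of the hash-total idea using a PeopleSoft table as an example. The key point is that both platforms have to compute the identical digest expression, and you have to watch NULL handling and trailing-blank semantics, so treat this as a starting point rather than a recipe:

-- Oracle target (STANDARD_HASH is 12c; DBMS_CRYPTO.HASH can substitute on 11g).
-- Sums the first four bytes of an MD5 of the concatenated character fields.
SELECT COUNT(*) AS row_cnt,
       SUM(TO_NUMBER(SUBSTR(RAWTOHEX(
             STANDARD_HASH(emplid || '|' || name, 'MD5')), 1, 8), 'XXXXXXXX')) AS digest
  FROM ps_names;

-- SQL Server source: the same first four bytes of the same MD5, summed as an integer.
-- Assumes the columns are NOT NULL (PeopleSoft character fields normally are).
SELECT COUNT(*) AS row_cnt,
       SUM(CAST(CONVERT(VARBINARY(4),
             HASHBYTES('MD5', emplid + '|' + name)) AS BIGINT)) AS digest
  FROM ps_names;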


System/Integration Testing is also very important.  For one thing, you'll want to have confidence that the system behaves the way you expect it to.  And interfaces can generate problems of their own.  One common problem is that your interface programs probably assume the default date format for a database platform, and interfaces that don't specify a date format can choke when the incoming date format doesn't match what's expected, or they can send the wrong date format to an output file.  Switching from a non-Unicode database platform to Unicode can cause other problems.  You'll want to execute all of your interfaces and make sure the results are valid.
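
One cheap defense is to stop relying on the session's default date format in interface SQL altogether. A minimal sketch of the pattern, with invented table and column names:

-- Fragile: depends on whatever default date format the platform/session happens to use.
SELECT voucher_id, invoice_dt
  FROM ps_voucher
 WHERE invoice_dt > '01-JAN-13';

-- Safer: explicit formats in both directions mean the same thing on any platform.
SELECT voucher_id, TO_CHAR(invoice_dt, 'YYYY-MM-DD') AS invoice_dt
  FROM ps_voucher
 WHERE invoice_dt > TO_DATE('2013-01-01', 'YYYY-MM-DD');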


User Acceptance Testing is important as well.  Users who know the processes should spend some time in the system making sure all of the critical functionality works and they feel comfortable everything is working as expected. They’ll need to be on the lookout for date format issues, performance issues, data entry discrepancies, etc.  They should also spend quality time with their reporting to make sure no new errors were introduced during the build phase.


And finally, Performance Testing should be conducted to flush out any new problems introduced by the new platform.  The performance testing should include online performance testing and batch performance testing, and if possible, data from PeopleSoft Performance Monitor should be captured and compared to baseline performance data from the old system.


Online performance testing is typically conducted using a tool like LoadRunner or Rational Performance Tester.  You record scripts based on a set of highly used business processes and play them back in volume.  While the test is executing your admin team will monitor system performance on the various servers and look for bottlenecks.  At the end of the day this is necessary for a performance testing effort (especially if you’re making multiple changes like migrating to new hardware and/or a PeopleTools release).  However, it’s not sufficient to identify all performance issues.


One of the big deficiencies of online performance testing is that it doesn't test batch jobs, or if it does, the coverage is very limited.  For batch testing, you'll want to define a testing mechanism that reflects the real, observed jobs that run in your production environment.  You'll want to make sure that the jobs have real data to process.  And you'll want to make sure the jobs are sequenced in such a way that dependencies are tracked to some extent.  After all of that, you'll want to come up with a way to track and report results.  I'll write more about the approach and toolset I've used to get realistic batch tests in the future, so stay tuned.


The other deficiency of online performance testing is that PeopleSoft is just too complicated an application to expect a robot to crawl all of the web pages looking for slowness…

Changing your PS Database Platform: The Build Phase

Tue, 2013-02-05 03:05

In my last two postings I wrote about how you might plan a project where you migrate your PeopleSoft application from one database platform to another, and how you might approach the Design phase.  I wanted to share my thoughts about the Build phase in this article.  I'll share my thoughts about the Test and Cutover phases in my next posting(s).


The Build Phase


The Build Phase is always my favorite part of any PeopleSoft project, probably because I come from a development background.  The Build phase of a replatforming project is in some ways very straightforward, and in other ways more difficult.  The problem isn't in the coding changes -- it's not too difficult to make a piece of SQL work on a different database platform -- the challenge is in Unit Testing.  Every SQL that is touched must be unit tested, and that will be the biggest part of the effort.


Most developers are used to unit testing their own work.  But it's a good idea to use a code and testing review where developers document each object change and unit test, and another developer reviews the results.  Since there will be many small changes, the documentation requirement should be light, but it should include a trace file that proves that each App Engine step, PeopleCode SQL, and SQR function was executed and tested.  How structured your process is will depend on the size and location of your team.  Ensuring quality with process and documentation might not be as important in a small shop, but it is critical to your success if you have a large development team located offshore.


Unit testing is the only opportunity you’ll have to actually test each piece of modified code.  Subsequent phases will test the system overall, but you will probably not achieve 100% code coverage.  Fortunately, almost all of your defects can actually be caught in unit testing of a replatforming project so you should use this to your advantage.  Defects that get missed will haunt you in later testing phases where they’ll be more visible and more expensive to fix.


Also as part of this phase, your DBA team should execute another mock cutover using the tools and steps you've decided to use for the real cutover.  The resulting database (plus the code generated in the Build phase) will be the starting point for your first test database.


And the testing team should start building the test scripts for the subsequent test phases here.  Since we’re not changing application functionality, they should be able to leverage existing scripts from prior upgrades or implementation and enhance them for functionality that was added since the original scripts were created.

Changing Your PS Database Platform: The Design Phase

Sun, 2013-01-27 22:52

In my previous article I described how you might approach planning and requirements gathering for a PeopleSoft database replatforming project.  You would exit that phase with solid RICE object inventories and an idea of any new PeopleTools functionality you will deploy.


I'd like to take you through some thoughts and considerations for the remaining phases (Design, Build, Test, Deploy).  This article will just focus on the Design phase, and I'll publish the other articles in the near future.  For this discussion, I'm going to assume we're doing a PeopleTools upgrade along with the replatforming effort, and we're going to keep existing functionality/features/customizations the same as the current state.  This is usually a good idea because the technology changes will be challenging enough.


The Design Phase


You might think a basic replatforming project doesn't require a lot of design.  Once you have the object inventories from the planning/requirements gathering phase, you have enough information to start modifying the SQL to make it work on the new database platform.  The one thing I would suggest, though, is to bundle the raw object lists into logical chunks of work.  For example, if you have a component, page and process that work together to execute a business process, you should bundle them together so a developer can unit test all of them at the same time.  If you want to deploy new PeopleTools features, you'll want to spend some time deciding which ones will be useful and how you will configure, test, and deploy them.


But there’s a bit more work you need to do in this phase.  First, you’ll want to identify any external system that might need to be updated as well.  Any system that uses database links or has dependencies on your current PeopleTools environment (think Component Interface libraries) will need to be investigated to determine the impact and appropriate action to take.


Another decision you’ll need to make is with reports and queries.  You probably have a LOT of public queries, and you may have a lot of reports.  nVisions in particular seem to multiply if you don’t have strong governance processes to limit users to specific standard reports. 


So how do you deal with this situation?  It's not always cost effective to upgrade and test every one.  Here are a few suggestions to manage this problem:


1)   Ask your users to provide their business-critical reports and queries.  This will be the list that you "certify" will work correctly and perform well on the new platform.  You'll spend whatever time is necessary during development and the testing phases to make very sure that these queries and reports are defect free.


2)   Identify all of the reports and queries that have been run in the last couple of years via process scheduler and query audit tables.  All of these will be your priority 2 set.  Scan this set using automated techniques to identify problems, correct any that fall out, and unit test everything that is modified.  Be sure a good percentage of these are tested in future test phases and give users an opportunity to test them during user acceptance testing.


3)   Other reports and queries won’t be touched.  Breaks here will be handled by your post go-live break-fix process.


The Design Phase is also when you should prepare your testing plan and your communication plan.


While this phase is progressing, your DBA team should execute an initial replatform from the database on the old platform to the database on the new platform.  For this exercise, we'll just use Data Mover to extract every table on the source database platform to a flat file, and to import it to the new database platform.  Once on the new DB platform you'll need to manually adjust tables like PSDBOWNER, PSOPTIONS, etc.  Execute the PeopleTools upgrade (if necessary) and you're done. Don't expect this to go quickly the first time around -- allow yourself 2-4 weeks in the schedule.  And capture all of the steps in detail because it will be the start of your cutover plan.  The environment this exercise produces will become your new development environment so that you can start your build phase.
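
For what it's worth, the manual adjustments are usually just small SQL updates along these lines. The values are made up and the authoritative steps are in the install/upgrade documentation for your release, so this is only a sketch:

-- Repoint the PeopleSoft metadata at the new database name and access ID.
-- On Oracle, PSDBOWNER normally lives in the PS schema.
UPDATE ps.psdbowner
   SET dbname  = 'FSNEWPRD',   -- new database name (made up)
       ownerid = 'SYSADM'
 WHERE dbname  = 'FSOLDPRD';

-- Review the global options row for anything platform-specific.
SELECT * FROM psoptions;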


Also during this phase you should make decisions about the technologies and tools you'll use to make your job easier.  One is Oracle GoldenGate.  GoldenGate is a Change Data Capture tool that supports multiple database platforms.  It gives you some amazing capabilities around extracting table data from one platform in parallel, shipping the extracted files to a target file server, and importing them in parallel while the extract is still…

Changing Your PeopleSoft Database Platform

Sun, 2013-01-20 01:43

There are several reasons why you might decide to migrate PeopleSoft to a new database platform.  It could be that you need to move to a more scalable platform.  Or you may be looking to reduce your annual license costs and want to switch to a more cost-effective platform.  Whatever the reason, you can certainly take advantage of PeopleSoft's multiple-database support and change your database platform.  This article will give you some ideas about how to plan the effort.


One of the first things to consider is whether or not you want to upgrade to the latest version of PeopleTools.  This may be required, especially if you want to deploy the latest version to the database platform you're migrating to.  If this can be done in advance of the replatforming, that makes the actual replatforming easier; but it does require its own testing and deployment cycle, and depending on the age of your current database platform it might be impossible.  In that case you will have to do the tools upgrade along with the database replatforming.


If you're upgrading PeopleTools as part of the project, you need to decide whether you want to introduce the new look and feel of PeopleTools and roll out new PeopleTools enhancements to your users.  Review the release notes to get a good list of new features and enhancements, and carefully choose what will be deployed as part of your scope.


You'll probably need to purchase new hardware to run your new database.  This may also be a good time to do a larger hardware refresh on your web/app/batch tiers.  If you're adding new hardware, be sure you size the hardware to run the new database according to your hardware vendor's recommendations. They all have PeopleSoft-specific sizing questionnaires and will be happy to assign an engineer to assist you with your needs.  Also give yourself adequate time in your project plan to procure the hardware and have it installed and configured.


 Minimize non-technical changes. This isn’t the best time to change business processes or implement new functionality.  It’s not even the best time to de-customize.  If you have to do it, plan for the additional testing and change management (training/communication) that will be required.


One of the first things you should do to plan out this effort is to get a good list of the objects that will need to be updated as part of the replatforming effort.  We can assume that the delivered code will work on whatever supported database platform you want to migrate to, so we can focus much of our level of effort analysis on custom and customized code.


Processes, Reports and Queries


Get a list of all processes and reports from the process definition tables.  Use the process scheduler tables to flag which processes have been executed in the last 2 years or so.  This will serve as your active process list and will define the in-scope processes for this effort.  Any processes, reports and queries discussed should be filtered against this active process list to prevent spending time on unused or unneeded processes.
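
A sketch of that query against the standard PeopleTools tables (column names from memory, so verify them against your release):

-- Flag which process definitions have actually been run in the last two years.
SELECT d.prcstype,
       d.prcsname,
       COUNT(r.prcsinstance) AS runs_last_2_years
  FROM ps_prcsdefn d
  LEFT JOIN psprcsrqst r
         ON  r.prcstype = d.prcstype
         AND r.prcsname = d.prcsname
         AND r.rqstdttm >= ADD_MONTHS(SYSDATE, -24)
 GROUP BY d.prcstype, d.prcsname
 ORDER BY runs_last_2_years DESC;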


To capture which nVision and Crystal reports are being run, you'll need to join PSPRCSRQST to PSPRCSPARMS and pick out the report name from the command line stored in the ORIGPARMLIST field.
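
Something along these lines; the table and column names are as I recall them, and the process type strings and the parsing of ORIGPARMLIST will depend on your release and your report naming conventions, so this is only a sketch:

-- Pull the original command line for nVision and Crystal runs so the report name
-- can be picked out of ORIGPARMLIST.
SELECT r.prcsinstance,
       r.prcstype,
       r.prcsname,
       r.rqstdttm,
       p.origparmlist
  FROM psprcsrqst  r
  JOIN psprcsparms p
    ON p.prcsinstance = r.prcsinstance
 WHERE r.prcstype IN ('nVision-Report', 'Crystal')
   AND r.rqstdttm >= ADD_MONTHS(SYSDATE, -24);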


PeopleCode Analysis


Extract the custom PeopleCode from PSPCMPROG to a flat file using DecodePeopleCode.sqr.  Run another program to extract all SQLEXEC statements, and execute each one against the target database platform.  Be sure to flag any App Engines as being used in your process list if you find active Application Engine PeopleCode.


Views Analysis


Analyzing views is a bit more straightforward.  Use App Designer or Data Mover to build ALL views on the target database platform, and identify which ones error out during the build.
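
On an Oracle target, one hedged way to sweep up the failures after a mass build is to compare the view records PeopleTools thinks it has against what actually exists in the schema. The RECTYPE value and the SQLTABLENAME handling below are from memory, so confirm them for your release:

-- View records defined in PeopleTools with no matching view in the database,
-- i.e. the ones that presumably errored out during the build.
SELECT r.recname
  FROM psrecdefn r
 WHERE r.rectype = 1        -- 1 = SQL View, as I recall
   AND NOT EXISTS
       ( SELECT 1
           FROM user_views v
          WHERE v.view_name = DECODE(RTRIM(r.sqltablename),
                                     NULL, 'PS_' || r.recname,
                                     r.sqltablename) );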


SQL Object Analysis


Extract the custom SQL from PSSQLTEXTDEFN into a flat file and search it for specific keywords that aren't going to work on your target platform.  For example, if you're migrating from SQL Server to Oracle you might look for keywords like getdate, nolock, datediff, dateadd, day, month, year, str, left, right, +, .., *=, datepart, isnull, convert, select top, len, inner, outer, .dbo., xact_abort, implicit, round.


This approach still requires a human touch.  Some of the SQL flagged to be modified may be fine, because the keywords may be valid PeopleSoft meta-SQL functions, which are prefixed with "%", e.g. %DateAdd.
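
If you'd rather scan inside the database than grep a flat file, the same keyword pass can be run on the SQL Server source, where LIKE works against the long SQL text column. Only a handful of keywords are shown, and you'd want to add a predicate for your site's custom naming prefix -- a sketch, not a complete scan:

-- SQL Server source: flag SQL objects containing platform-specific syntax.
SELECT sqlid, sqltype, dbtype
  FROM pssqltextdefn
 WHERE sqltext LIKE '%getdate%'
    OR sqltext LIKE '%nolock%'
    OR sqltext LIKE '%datediff%'
    OR sqltext LIKE '%isnull%'
    OR sqltext LIKE '%.dbo.%';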


Also keep in mind different database platforms have different conventions for outer joins, allowing subquery joins, etc.  This type of syntax difference is…

Amazon Web Services is Ready for the Enterprise

Thu, 2011-08-04 15:49

Amazon has been steadily moving toward making their web service offering ready for the enterprise. Over the last year or so they've received certification for Oracle database, they've broken down the barriers that would prevent PCI certification, and they've improved their pricing structure to make it more corporation-friendly.


Today they may have finally broken down the final barriers to large-scale enterprise adoption with the following announcements:


Virtual Private Cloud is now out of beta and allows you to "provision a private section of the AWS cloud where you can create a virtual network that you control, including selection of an IP address range, creation of subnets, and configuration of route tables and network gateways. You can connect your Amazon VPC directly to the Internet while also extending your corporate data center to the cloud using encrypted VPN connections."


But the announcement of Amazon Direct Connect might be my favorite. "Amazon Direct Connect is a new service that enables you to bypass the internet and deliver data to and from AWS via private network connection. With a private connection, you can reduce networking latency and costs, and provide a more consistent network experience while moving data between AWS and your datacenters. With pay-as-you-go pricing and no minimum commitment, you pay only for the network ports used and the data transferred out from AWS over the private connection."


There's also new functionality for AWS Identity and Access Management that lets you use your existing corporate identity management system to grant secure and direct access to AWS resources without creating a new AWS identity for those users.


I'm excited about the possibilities this opens up in terms of on demand computing capacity in the enterprise.

Managing Chartfields and Trees Across PeopleSoft and Hyperion

Tue, 2011-07-26 04:07

If you’re implementing Hyperion applications to complement your PeopleSoft Financials application, one decision you’ll have to make relatively early is which tool to use to manage your core dimensions and their associated hierarchies.  Here are the options:



Native Functionality
Hyperion EPMA
Hyperion Data Relationship Management

So which one is the right choice?  Based on my research and discussions with Christopher Dwight, a member of Oracle’s Master Data Management practice, here’s what I have learned:


The native functionality basically means you’ll maintain your dimensions in each application separately.  So if you want to add a department, you’ll have to add it to PeopleSoft, then Hyperion Financial Management, then Planning separately.


Hyperion EPMA provides a robust, single point of administration for EPM applications.  It allows you to create a dimension library which allows several EPM dimensions to be centrally stored and re-used across multiple EPM applications.  Basic dimension editing capabilities are provided.  Individual dimension elements ("nodes" or "members") can be flagged for use within a specific application, supporting slightly different application requirements while promoting dimension re-use.  Although this feature has potential, each member must be individually flagged, limiting the usability for large dimensions.  EPMA is intended to support only Hyperion EPM applications, and to be utilized by system administrators, not the typical end user.


DRM is different in that it was conceived from the start as an agnostic enterprise dimension management platform, not beholden to Hyperion EPM applications alone.  As such, DRM can be deployed to support financial metadata and dimensions in a myriad of systems, ranging from PeopleSoft to GEAC to SAP to Cognos to Teradata to Hyperion and many more.  It was also designed to support not only system administrator users, but also to allow business users to become direct contributors to the dimension management process.

Data Warehousing Made Easier

Thu, 2010-12-16 04:05

I was talking to an Oracle sales rep this week about OBIEE.  Since this is a PeopleSoft blog I guess I'd better explain.  OBIEE is Oracle's Business Intelligence offering.  It's solidly in Forrester's leaders quadrant and it has all of the BI features you would expect (reporting, ad-hoc analysis, dashboards, alerts, etc.). The question at hand was why we should implement it when we already have some perfectly good BI tools with committed users who truly believe in them.


We didn't get into a deep discussion about BI features.  Everybody knows what a BI solution should do by now, and the leading tools do it well enough that there's not much differentiation (at least that's how I interpret the recent Forrester report).  So what difference does a BI tool make at this point?


I’m not sure what the Cognos or MicroStrategy reps would say, but Oracle laid out an interesting case.  If your enterprise applications are built around Oracle applications like PeopleSoft, Hyperion, Siebel, or EBS and Essbase, you can get out of the box functionality that no other tool can match. 



What I expect from Oracle Open World 2010

Thu, 2010-09-02 07:16

I'm going to Oracle OpenWorld again this year, and I just finished building my schedule.  Wow.  This year there are so many sessions I want to attend in the same time slots that I won't be able to see a fraction of what I want to.  Guess I'll have to skip the session on how to author Blu-ray discs using Java in favor of a product roadmap session I need to attend.  You see, I've started a new project and I have a whole laundry list of stuff I need to come up to speed on.  I'm sure it'll be an informative but exhausting week -- it always is.


Gazing into my crystal ball, I’m expecting to hear more about the Fusion applications that were introduced at the end of the 2009 OOW.  I think Oracle isn’t re-inventing all of the functionality in their mature ERP/CRM product lines like PeopleSoft, JDE, EBS, and Siebel.  But all the same I’m expecting to see some products that are ready for launch and looking snazzy with the deep integration with BI and other apps that Oracle has invested so heavily in.


Speaking of BI, I’m looking forward to seeing the new release of OBIEE.  BI apps just look cool, and their functionality makes things like PeopleSoft Matching functionality seem boring in comparison.  I'm hoping to see support for a ton of data sources and the ability to publish interactive reports to latest generation mobile devices.  Unfortunately I think I missed the BI boat at some point in my career, so bring on the 3-way match!


 


The True Cost of a Core

Thu, 2010-08-26 07:21

Servers are becoming more powerful as manufacturers find new ways to get more cores into a CPU.  Today it's not uncommon to see hexa- and octa-core processors shipping at the same price points that processors with far fewer cores shipped at yesterday.  Where manufacturers once got their performance improvements through raw CPU speed, they are now getting the majority of their performance improvement through more cores per processor chip.


Unfortunately the economics of additional cores for performance aren't the same as improvements through faster clock speeds, because software manufacturers have largely tied their technology licensing to the number of cores on a system, and their pricing isn't decreasing as the number of cores on these new servers increases.


For example, say you buy a basic server with two hexa-core processors, so you're looking at 12 cores on the box.  Now let's suppose the list price for Oracle Database is $47,500 per processor license, and that Oracle's core factor table counts each of your x86 cores as 0.5 of a processor.  Twelve cores times 0.5 is six processor licenses, so your list price to run an Oracle database on your new server will be around $285,000.  And that's not counting tuning packs, diagnostic packs, management packs, or even maintenance -- which is calculated as a percentage of the base price.  It turns out the cheapest part of this equation may be the hardware!


So if you’re planning on running software from the big vendors, conduct a solid sizing exercise and be sure to buy just the number of cores that you need.  Leave empty sockets for growth, but you might want to choose models that let you scale with fewer cores to avoid breaking the bank. Avoid sharing servers with more than one software package that is licensed per core (i.e. Informatica and Oracle DB), or you could end up paying double for server capacity that you’ll never be able to fully realize.  And when you DO add cores, be sure to also purchase the additional licenses to stay in compliance.  I’ve heard that software vendors’ compliance teams occasionally check up on you, and running with a few extra cores could break more than your annual budget.