Monday, May 7, 2012
Saturday, November 6, 2010
I am a good leader because of the following qualities, which I apply across my life and work:
1) Integrity: the integration of outward actions and inner values.
2) Dedication: spending whatever time or energy is necessary to accomplish the task at hand.
3) Magnanimity: ensuring that credit for successes is spread as widely as possible throughout the organization.
4) Humility: recognizing that, as a leader, I am no better or worse than other members of the team.
5) Openness: being able to listen to new ideas, even if they do not conform to the usual way of thinking.
6) Creativity: the ability to think differently, to get outside of the box that constrains solutions.
7) Fairness: dealing with others consistently and justly.
8) Assertiveness: which is not the same as aggressiveness.
9) A sense of humor: vital to relieve tension and boredom, as well as to defuse hostility.
Taken together, these qualities make a good leader.
Leaders make things happen by:
* knowing their objectives and having a plan for how to achieve them
* building a team committed to achieving those objectives
* helping each team member to give their best efforts
As a leader you must also know yourself: know your own strengths and weaknesses, so that you can build the best team around you.
"Good business leaders create a vision, articulate the vision, passionately own the vision and relentlessly drive it to completion."
"Skill in the art of communication is crucial to a leader's success. He can accomplish nothing unless he can communicate effectively."
– Norman Allen
Monday, November 23, 2009
Adaptability
I look for people who are flexible when things change. I can cope with changing demands, uncertainty and stress, and I remain calm and composed. I can demonstrate that I have successfully completed several projects or assignments with competing deadlines.
Teamwork & Collaboration
This competency is about how I work with others to achieve shared business goals. I respect and value others' differences, and I can easily build and maintain relationships. I offer support and help to others and share my expertise with them to enhance the effectiveness of the team.
Communication
This competency focuses on how I communicate with others. I can present oral and written information clearly, precisely and succinctly. I match the way I communicate to the requirements of the situation and my audience. I listen carefully to others, asking questions when necessary to ensure understanding.
Drive to achieve
I am committed to success and to accomplishing challenging goals. I take the initiative to learn new skills that will be useful for my future career, and I learn about things beyond the scope of my current job or assignment. I am prepared to put in as much additional time or effort as is necessary to ensure high-quality results.
Creative problem solving
This is all about using ingenuity, supported by logical methods and appropriate analysis, to propose solutions to problems. I conduct thorough fact-finding and analysis, anticipating potential problems and planning accordingly. I think 'outside the box' when proposing solutions, put forward new ideas for activities such as client business reporting and consolidation, and offer innovative ideas to overcome challenges.
Client focus
I work in a strongly client-focused organisation, so I need people working with me who share this focus and can anticipate clients' needs and respond appropriately. I don't think about 'clients' just in the sense of 'customers': clients can also be colleagues, study groups, maybe even lecturers. I build rapport quickly and easily and think about a situation from the client's point of view. Based on the business requirements, I recommend solutions that meet their needs, and I take the initiative to act with their satisfaction as my top priority.
Passion for the business
This is all about being able to demonstrate a passion for our clients' business concerns and the industry in which they operate. I know what the business does and what its recent achievements have been. I can demonstrate knowledge of recent trends within the IT and consulting industry, and I understand the message behind the current 'Smarter Planet' advertising campaign.
Taking ownership
This is all about proactively identifying and taking responsibility for tasks and decisions in a timely manner. I can demonstrate occasions when I have accepted responsibility for mistakes and worked to correct them. I focus on resolving difficult situations rather than finding someone to blame, and I anticipate potential problems with a project and plan accordingly, implementing decisions with speed, reliability and urgency.
Tuesday, July 28, 2009
The time-consuming step is the first part. This step might take a long time to collect all the delta information if the FI application tables in the ECC system contain many entries, or when parallel running processes insert changed FI documents frequently.
A solution might be to execute the Delta InfoPackage to BI more frequently, to process smaller sets of delta records. However, this might not be feasible for several reasons. First, it is not recommended to load data with a high frequency into BI using the normal extraction process. Second, the new Real-Time Data Acquisition (RDA) functionality delivered with SAP NetWeaver 7.0 can only be used within the new dataflow; this would make a complete migration of the dataflow necessary. Third, as of now the DataSource 0FI_GL_4 is not officially released for RDA.

To be able to process the time-consuming first step without executing the Delta InfoPackage, the ABAP report attached to this document executes the first step of the extraction process in an encapsulated way. The report reads all the new and changed documents from the FI tables and writes them into the BI delta queue. It can be scheduled to run frequently, e.g. every 30 minutes.

The Delta InfoPackage can be scheduled independently of this report. Most of the delta information will then be read from the delta queue. This greatly reduces the number of records that the time-consuming step (the first part of the extraction) has to process from the FI application, as shown in the picture below.
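To make the idea concrete, here is a minimal, language-agnostic sketch in Python (this is not SAP code; all class and function names are illustrative). It shows why a frequently scheduled collector shrinks the work left for the infrequent Delta InfoPackage: most changed documents are already waiting in the delta queue when the InfoPackage runs.

```python
# Conceptual model: FI application tables, a delta queue, a scheduled
# collector report, and the Delta InfoPackage that reads the queue.

class DeltaQueue:
    """Stands in for the BI delta queue of one extractor."""
    def __init__(self):
        self.records = []

class SourceTables:
    """Stands in for the FI application tables holding changed documents."""
    def __init__(self):
        self.changed_docs = []

    def drain_changes(self):
        """The 'time-consuming first step': read all new/changed documents."""
        docs, self.changed_docs = self.changed_docs, []
        return docs

def collector_run(source, queue):
    """The scheduled report: move pending changes into the delta queue."""
    queue.records.extend(source.drain_changes())

def delta_infopackage(source, queue):
    """The BI delta request: queued records plus the (small) remainder."""
    payload = queue.records + source.drain_changes()
    queue.records = []
    return payload

source, queue = SourceTables(), DeltaQueue()
source.changed_docs = [f"doc{i}" for i in range(1000)]
collector_run(source, queue)          # runs e.g. every 30 minutes
source.changed_docs = ["doc1000"]     # a few changes arrive afterwards
payload = delta_infopackage(source, queue)
# The InfoPackage had to read only 1 record from the application tables;
# the other 1000 were already waiting in the delta queue.
```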
4. The Step-by-Step Solution

4.1 Implementation Details

To achieve an encapsulated first part of the original process, the attached ABAP report creates a faked delta initialization for the logical system 'DUMMY_BW'. (This system can be named anything, as long as it does not exist.) This creates two delta queues for the 0FI_GL_4 extractor in the SAP ERP ECC system: one for 'DUMMY_BW' and the other for the 'real' BI system.

The second part of the report executes a delta request for the 'DUMMY_BW' logical system. This request reads any new or changed records since the previous delta request and writes them into the delta queues of all connected BI systems.

The reason for the logical BI system 'DUMMY_BW' is that the function module used in the report writes the data into the delta queue and marks the delta as already sent to the 'DUMMY_BW' BI system. This is why the data in the delta queue of the 'DUMMY_BW' system is not needed for further processing; it gets deleted in the last part of the report.

The different delta levels for different BI systems are handled by the delta queue and are independent of the logical system. Thus, the delta is available in the queue of the 'real' BI system, ready to be sent during the next Delta InfoPackage execution. This methodology can be applied to any BI extractor that uses the delta queue functionality.

As this report uses standard functionality of the Plug-In component, the handling of data requests for BI has not changed. If the second part fails, it can be repeated. The creation and deletion of delta initializations is also unchanged. The ABAP report and the normal FI extractor read the delta sequentially; the data is sent to BI in parallel.

If the report is scheduled to be executed every 30 minutes, it might happen that it coincides with the BI Delta InfoPackage execution.
In that case some records will be written to the delta queues twice, by both processes. This is not an issue: further processing in the BI system using a DataStore Object with delta handling capabilities will automatically filter out the duplicated records during data activation. Therefore the parallel execution of this encapsulated report with the BI Delta InfoPackage does not cause any data inconsistencies in BI. (Please refer also to SAP Note 844222.)
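The duplicate-filtering behaviour can be illustrated with a small sketch in Python (again, not SAP code; the key name and records are made up). Because a DataStore Object keys its active data, writing the same document twice simply collapses into one row during activation:

```python
# Illustrative model of DSO activation with overwrite semantics:
# records are keyed by document number, so a duplicate delta record
# overwrites (rather than duplicates) the earlier one.

def activate(active_table, new_records):
    """Merge new records into the active table, keyed by document number."""
    for rec in new_records:
        active_table[rec["docnr"]] = rec   # later duplicate overwrites earlier
    return active_table

# Both the collector report and the Delta InfoPackage queued document 4711:
delta = [
    {"docnr": "4711", "amount": 100},
    {"docnr": "4712", "amount": 250},
    {"docnr": "4711", "amount": 100},   # duplicate from parallel execution
]
active = activate({}, delta)
assert len(active) == 2                  # duplicate filtered out on activation
```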
4.2 Step-by-Step Guide

1. Create a new Logical System using transaction BD54. This Logical System name is used in the report as a constant: c_dlogsys TYPE logsys VALUE 'DUMMY_BW'. In this example the name of the Logical System is 'DUMMY_BW'; the constant in the report needs to be changed to match the Logical System name defined in this step.

2. Implement an executable ABAP report YBW_FI_GL_4_DELTA_COLLECT in transaction SE38. The code for this report can be found in the appendix.

3. Maintain the selection texts of the report. In the ABAP editor menu, choose Goto > Text Elements > Selection Texts.

4. Maintain the text symbols of the report. In the ABAP editor menu, choose Goto > Text Elements > Text Symbols.

5. Create a variant for the report. The "Target BW System" has to be an existing BI system for which a delta initialization exists. In transaction SE38, click Variants.

6. Schedule the report via transaction SM36 to be executed every 30 minutes, using the variant created in step 5.
*&---------------------------------------------------------------------*
*& Report YBW_FI_GL_4_DELTA_COLLECT
*&
*& This report collects new and changed documents for the 0FI_GL_4
*& DataSource from the FI application tables and writes them to the
*& delta queues of all connected BW systems.
*&
*& The BW extractor itself therefore needs only to process a small
*& amount of records from the application tables to the delta queue,
*& before the content of the delta queue is sent to the BW system.
*&---------------------------------------------------------------------*
REPORT ybw_fi_gl_4_delta_collect.

TYPE-POOLS: sbiw.

* Constants
* The 'DUMMY_BW' constant is the same as defined in Step 1 of the guide
CONSTANTS: c_dlogsys    TYPE logsys     VALUE 'DUMMY_BW',
           c_oltpsource TYPE roosourcer VALUE '0FI_GL_4'.

* Field symbols
FIELD-SYMBOLS: <l_s_roosprmsc> TYPE roosprmsc,
               <l_s_roosprmsf> TYPE roosprmsf.
* Data declarations
DATA: l_slogsys      TYPE logsys,
      l_tfstruc      TYPE rotfstruc,
      l_lines_read   TYPE sy-tabix,
      l_subrc        TYPE sy-subrc,
      l_s_rsbasidoc  TYPE rsbasidoc,
      l_s_roosgen    TYPE roosgen,
      l_s_parameters TYPE roidocprms,
      l_t_fields     TYPE TABLE OF rsfieldsel,
      l_t_roosprmsc  TYPE TABLE OF roosprmsc,
      l_t_roosprmsf  TYPE TABLE OF roosprmsf.
* Selection parameters
SELECTION-SCREEN: BEGIN OF BLOCK b1 WITH FRAME TITLE text-001.
PARAMETERS: prlogsys LIKE tbdls-logsys OBLIGATORY.
SELECTION-SCREEN: END OF BLOCK b1.

AT SELECTION-SCREEN.
* Check logical system
  SELECT COUNT * FROM tbdls BYPASSING BUFFER
    WHERE logsys = prlogsys.
  IF sy-subrc <> 0.
*   The logical system & has not yet been defined
    MESSAGE e454(b1) WITH prlogsys.
  ENDIF.

START-OF-SELECTION.
* Get own logical system
  CALL FUNCTION 'RSAN_LOGSYS_DETERMINE'
    EXPORTING
      i_client = sy-mandt
    IMPORTING
      e_logsys = l_slogsys.

* Check if transfer rules exist for this extractor in BW
  SELECT SINGLE * FROM roosgen INTO l_s_roosgen
    WHERE oltpsource = c_oltpsource
      AND rlogsys    = prlogsys
      AND slogsys    = l_slogsys.
  IF sy-subrc <> 0.
*   No transfer rules for target system &
    MESSAGE e025(rj) WITH prlogsys.
  ENDIF.

* Copy record for dummy BW system
  l_s_roosgen-rlogsys = c_dlogsys.
  MODIFY roosgen FROM l_s_roosgen.
  IF sy-subrc <> 0.
*   Update of table ROOSGEN failed
    MESSAGE e053(rj) WITH text-002.
  ENDIF.

* Assignment of source system to BW system
  SELECT SINGLE * FROM rsbasidoc INTO l_s_rsbasidoc
    WHERE slogsys = l_slogsys
      AND rlogsys = prlogsys.
  IF sy-subrc <> 0 OR
     l_s_rsbasidoc-objstat = sbiw_c_objstat-inactive.
*   Remote destination not valid
    MESSAGE e053(rj) WITH text-003.
  ENDIF.

* Copy record for dummy BW system
  l_s_rsbasidoc-rlogsys = c_dlogsys.
  MODIFY rsbasidoc FROM l_s_rsbasidoc.
  IF sy-subrc <> 0.
*   Update of table RSBASIDOC failed
    MESSAGE e053(rj) WITH text-004.
  ENDIF.

* Delta initializations
  SELECT * FROM roosprmsc INTO TABLE l_t_roosprmsc
    WHERE oltpsource = c_oltpsource
      AND rlogsys    = prlogsys
      AND slogsys    = l_slogsys.
  IF sy-subrc <> 0.
*   Some of the initialization requirements have not been completed
    MESSAGE e020(rsqu).
  ENDIF.

  LOOP AT l_t_roosprmsc ASSIGNING <l_s_roosprmsc>.
    IF <l_s_roosprmsc>-initstate = ' '.
*     Some of the initialization requirements have not been completed
      MESSAGE e020(rsqu).
    ENDIF.
    <l_s_roosprmsc>-rlogsys = c_dlogsys.
    <l_s_roosprmsc>-gottid  = ''.
    <l_s_roosprmsc>-gotvers = '0'.
    <l_s_roosprmsc>-gettid  = ''.
    <l_s_roosprmsc>-getvers = '0'.
  ENDLOOP.

* Delete old records for dummy BW system
  DELETE FROM roosprmsc
    WHERE oltpsource = c_oltpsource
      AND rlogsys    = c_dlogsys
      AND slogsys    = l_slogsys.

* Copy records for dummy BW system
  MODIFY roosprmsc FROM TABLE l_t_roosprmsc.
  IF sy-subrc <> 0.
*   Update of table ROOSPRMSC failed
    MESSAGE e053(rj) WITH text-005.
  ENDIF.

* Filter values for delta initializations
  SELECT * FROM roosprmsf INTO TABLE l_t_roosprmsf
    WHERE oltpsource = c_oltpsource
      AND rlogsys    = prlogsys
      AND slogsys    = l_slogsys.
  IF sy-subrc <> 0.
*   Some of the initialization requirements have not been completed
    MESSAGE e020(rsqu).
  ENDIF.

  LOOP AT l_t_roosprmsf ASSIGNING <l_s_roosprmsf>.
    <l_s_roosprmsf>-rlogsys = c_dlogsys.
  ENDLOOP.

* Delete old records for dummy BW system
  DELETE FROM roosprmsf
    WHERE oltpsource = c_oltpsource
      AND rlogsys    = c_dlogsys
      AND slogsys    = l_slogsys.

* Copy records for dummy BW system
  MODIFY roosprmsf FROM TABLE l_t_roosprmsf.
  IF sy-subrc <> 0.
*   Update of table ROOSPRMSF failed
    MESSAGE e053(rj) WITH text-006.
  ENDIF.
* COMMIT WORK for changed metadata
  COMMIT WORK.

* Delete RFC queue of dummy BW system
* (Just in case entries of other delta requests exist)
  CALL FUNCTION 'RSC1_TRFC_QUEUE_DELETE_DATA'
    EXPORTING
      i_osource           = c_oltpsource
      i_rlogsys           = c_dlogsys
      i_all               = 'X'
    EXCEPTIONS
      tid_not_executed    = 1
      invalid_parameter   = 2
      client_not_found    = 3
      error_reading_queue = 4
      OTHERS              = 5.
  IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
            WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
  ENDIF.

* COMMIT WORK for deletion of delta queue
  COMMIT WORK.
* Get MAXLINES for data package
  CALL FUNCTION 'RSAP_IDOC_DETERMINE_PARAMETERS'
    EXPORTING
      i_oltpsource   = c_oltpsource
      i_slogsys      = l_slogsys
      i_rlogsys      = prlogsys
      i_updmode      = 'D '
    IMPORTING
      e_s_parameters = l_s_parameters
      e_subrc        = l_subrc.
  IF l_subrc <> 0.
*   Error in function module RSAP_IDOC_DETERMINE_PARAMETERS
    MESSAGE e053(rj) WITH text-007.
  ENDIF.

* Transfer structure depends on transfer method
  CASE l_s_roosgen-tfmethode.
    WHEN 'I'.
      l_tfstruc = l_s_roosgen-tfstridoc.
    WHEN 'T'.
      l_tfstruc = l_s_roosgen-tfstruc.
  ENDCASE.

* Determine transfer structure field list
  PERFORM fill_field_list(saplrsap) TABLES l_t_fields
                                    USING  l_tfstruc.

* Start the delta extraction for the dummy BW system
  CALL FUNCTION 'RSFH_GET_DATA_SIMPLE'
    EXPORTING
      i_requnr                     = 'DUMMY'
      i_osource                    = c_oltpsource
      i_showlist                   = ' '
      i_maxsize                    = l_s_parameters-maxlines
      i_maxfetch                   = '9999'
      i_updmode                    = 'D '
      i_rlogsys                    = c_dlogsys
      i_read_only                  = ' '
    IMPORTING
      e_lines_read                 = l_lines_read
    TABLES
      i_t_field                    = l_t_fields
    EXCEPTIONS
      generation_error             = 1
      interface_table_error        = 2
      metadata_error               = 3
      error_passed_to_mess_handler = 4
      no_authority                 = 5
      OTHERS                       = 6.
  IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
            WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
  ENDIF.

* COMMIT WORK for delta request
  COMMIT WORK.

* Delete RFC queue of dummy BW system
  CALL FUNCTION 'RSC1_TRFC_QUEUE_DELETE_DATA'
    EXPORTING
      i_osource           = c_oltpsource
      i_rlogsys           = c_dlogsys
      i_all               = 'X'
    EXCEPTIONS
      tid_not_executed    = 1
      invalid_parameter   = 2
      client_not_found    = 3
      error_reading_queue = 4
      OTHERS              = 5.
  IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
            WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
  ENDIF.

* Data collection for 0FI_GL_4 delta queue successful
  MESSAGE s053(rj) WITH text-008.
1. Create a query with exceptions.
2. Use the central Alert Framework:
- Create an alert category.
- Create a container element.
- Create the text for the alert message. Optionally, you can also enter a text and a URL for a subsequent activity, e.g. a link to a BI query that the recipient should check in order to react to the alert.
- In the last step of the alert category configuration, assign the alert to the end users. You can enter fixed recipients or roles; if you enter a role, all users assigned to that role will get the alert. You can also enter roles via the "Subscription Authorization" button, in which case the assigned users will have the option to subscribe to the alert later.
- Next, call the BEx Broadcaster and create an Information Broadcasting setting based on the query on which the exception has been defined. As distribution type, choose "Distribute according to exceptions". In the details you can choose either the distribution type "Send Email" or "Create Alert", if you want to distribute the alert via the Universal Worklist. As selection criterion you can either distribute all exceptions or choose a specific alert level; in our example we only want to distribute alerts with the level "Bad 3".
- Then assign the alert category you created before to your Information Broadcasting setting.
- Next, map the BI parameters of the query to the alert container elements. These parameters will then be passed over to the alert.
- In the last step, save the Information Broadcasting setting. You can execute the setting directly, or schedule its execution, e.g. periodically each week.
- As a result, you will see two new alerts in the Universal Worklist for all users assigned to the corresponding alert category. You can access the Universal Worklist in the Enterprise Portal under Business Intelligence > Business Explorer > Universal Worklist.
Delta upload (default): It is recommended to configure the DTP with upload mode "Delta". If a "Full" DTP is used, the PSA data must be deleted before each data load, because a Full DTP extracts all requests from the PSA regardless of whether the data has already been loaded. This means a Delta upload via DTP from the DataSource (PSA) into the InfoCube is necessary even if the data is loaded via a Full upload from the source into the DataSource (PSA) using an InfoPackage. (In other words: a Full load from the PSA via DTP will load all data from the PSA whether or not it was loaded before, so either the PSA should be deleted after each load, or the DTP should use Delta even when the load from the source into the PSA already uses Delta.)
Only Get Delta Once.
Get Data by Request: gets the oldest request.
Get runtime information of a Data Transfer Process (DTP) in a Transformation: I will give details in another blog.
Debug a Data Transfer Process (DTP) request: the debugging expert mode can be started from the Execute tab of the DTP. The "Expert Mode" flag appears when the processing mode "Serially in the Dialog Process (for Debugging)" is selected. Choose "Simulate" to start the debugger in expert mode. Debugging for already loaded data can be executed directly from the DTP Monitor: choose "Debugging".
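The Full vs. Delta DTP behaviour described above can be sketched in a few lines of Python (this is a conceptual illustration, not SAP internals; request names are made up). A Full DTP reads every request in the PSA on each run, while a Delta DTP keeps track of which requests it has already transferred and picks up only the new ones:

```python
# Conceptual model of DTP extraction modes against a PSA:
# Full always returns everything; Delta returns only untransferred requests.

def full_dtp(psa_requests, _state):
    return list(psa_requests)            # always everything, duplicates and all

def delta_dtp(psa_requests, transferred):
    new = [r for r in psa_requests if r not in transferred]
    transferred.update(new)              # DTP bookkeeping: mark as loaded
    return new

psa = ["REQ1", "REQ2"]
state = set()
first  = delta_dtp(psa, state)           # picks up REQ1 and REQ2
psa.append("REQ3")
second = delta_dtp(psa, state)           # picks up only the new REQ3
full   = full_dtp(psa, state)            # reads all three requests again
```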
Thursday, April 30, 2009
1. Fundamentals: What is NetWeaver? What is BI? What is SAP BW or Business Information Warehouse? What is SAP R3? Decision support in an Enterprise. Decision support v/s Operational Reporting. OLTP v/s OLAP. Fundamentals about working of SAP R3. Fundamentals about working of SAP BW.
2. Functions of BW: Reporting (Decision Support and Operational), Open Hub (Supply Data to External Applications), Planning (Business Planning and Simulation – BPS). SEM–BPS is now BW-BPS. BW as EDW (Enterprise Data Warehouse).
3. Data Modeling: Data modeling concepts. Concepts behind various Data Models used in OLTP and OLAP. Why different Data Models? MDM v/s ERM. Extended Star Schema used in SAP BW.
4. SAP BW Terminology: Communication Structure in SAP R3, Extract Structure, User Exit, Transfer Structure, Datasource, Source System, PSA, Transfer rules, Communication Structure in BW, Update Rules, Infocube, ODS (Operational Data Store), Infoobject, Master Data(Attributes, Texts and Hierarchies), Characteristics and Key Figures, Infoprovider, Datatarget, Infoprovider v/s Datatarget, Infoarea, Application Components, Administrator Work Bench, Multiprovider, Infoset, Bex Query, Infoset Query, Classic Infoset.
5. Data Flow: How the data flows from Source System in to BW? Data Flow Diagram for SAP R3 OLTP System to SAP BW. Types of Updates – Direct v/s Flexible. How the elements described above are used in Data Flow?
6. Infoobject: How to create an Infoobject? Types of Infoobjects. Characteristics and Key Figures, Master Data, Special types (Unit/Currency, Date), Data Structures in Infoobject. How to Load Data into an Infoobject? How is the Infoobject used in Reporting? Global Transfer Routine – How to use the Global Transfer Routine, and why it is used. Management of overlapping Master Data from Multiple Sources. Creating Direct Update Infosources Automatically. Infoobject as Infoprovider.
7. Types of Updates: Additive, Overwrite. Where to maintain the update type for a Datasource? Which objects use these update types (ODS/Infocube/Master Data)?
8. Infocube: How to create an Infocube? Types of Infocubes (Transactional, Basic, Remote/Virtual), Data Structures in an Infocube. Update types for Infocube. How is the data updated in the Infocube? Virtual Key Figures.
9. ODS Object: How to Create an ODS Object? Structure of an ODS Object. Update types in an ODS Object. Update Mechanism. Data structures in an ODS Object.
10. Infosource: Creating Infosource, Update and Transfer Rules, Update Routine, Transfer Routine, Start Routine, Start up Routine.
11. AWB – Administrator Work Bench: Functions of AWB – Modeling, Monitoring, Reporting Agent, Transport Connection, Documents, Business Content, Translation, Metadata Repository.
12. Transports: How transports work in BW. How to create Transports? Efficient ways to create Transports in Different Scenarios.
13. Business Content: Standard Business Content in R3 and BW. Transferring Datasources and Application Component Hierarchy in R3. Replication of Datasources. Activation of Business Content. How to Activate Business Content – various Scenarios.
14. Data Extraction Using Flat Files: How to generate Transfer Structure from Communication Structure? Loading Data using Flat File.
15. Data Extraction from SAP R3 – Data Collection: Transaction Processing in SAP R3. Update types in SAP R3. How "Delta" is managed. V3 Control. Direct and Queued Delta. Update Mechanism in SAP R3. Manipulation of Data in SAP R3. Transaction User Exits, LIS User Exits, BW User Exits.
16. Data Extraction from SAP R3 –Application Specific: Infrastructure needed for loading data – Datasource, LO-Cockpit Datasources, CO_PA, FI-SL, FI_Line Item Extraction, LIS Extraction.
17. LO-Cockpit Extraction: Demonstrate each step in data extraction by performing transactions in SAP R3.
18. Data Extraction from SAP R3 – Generic: How to Create Generic Datasource? Delta Management. Generic Delta.
19. Data Extraction from SAP R3 – Maintain Datasource: What is - Direct Access, Delta, Inversion, Selection, Field only…, Hide. How to Maintain Datasource?
20. Data Extraction from SAP R3 – Modifying Data: Manipulation of Transaction Data / Master Data / Texts / Hierarchies. How to program in User Exit? Concept of Project. Function Modules used for Data Manipulation.
21. Modify Datasource: Demo the process of modifying Master Data Source by adding additional field and filling it with Data in BW.
22. Data Extraction from XML Source: XML Integration, Creating Flat File Datasource and generating Myself Datasource. Create and Maintain “Delta Queue” in BW System.
23. DB Connect: Concepts.
24. Data Mart Interface: Extraction within BW System. ODS to ODS, ODS to Infocube and Infocube to Infocube Extraction.
25. Open Hub Services: Infospoke, Create and Schedule Infospoke. BW as Open Hub or Data Hub.
26. Performance Management: How to improve performance of Data Load and Query?
Partitioning. Indexes in Infocubes and ODS’s. Aggregates on Infocubes. Compression.
27. Production Support: Monitoring Jobs, Process Chains, Event Chains, Infopackage Groups, Creating and Triggering Events, Common Problems, How to Fix them.
28. BW Presentation – Queries: Bex Analyzer, Query Designer, Queries in standard Excel Front end, Functions, Formulas, Calculated and Restricted Key Figures, Tabular Display. Query Views. Exception Reporting.
29. Web Interface: Launching Queries in Web Front end. Building simple Web site for launching Queries.
30. Web Application Designer: Web Templates. Creating Web Template with Company Logo. Adding more than one Query in a Web Page.
31. Reporting Agent: Demo - Scheduling a Query with exceptions to run at certain time and send e-mail with the result as attachment.
32. Report to Report Interface: Also known as RRI or Query Jump. Jump from Aggregate to Detailed Query with selections from the Aggregate Query.
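As a small illustration of item 7 above (Types of Updates), here is a sketch in Python (conceptual only, not SAP code) of the difference between the two update types: with Overwrite (typical for an ODS object) the latest key-figure value replaces the stored one, while with Additive (typical for an Infocube) incoming values are summed up.

```python
# Conceptual model of the two update types applied to a key figure.

def update(store, key, value, mode):
    if mode == "overwrite":
        store[key] = value                       # latest value wins
    elif mode == "additive":
        store[key] = store.get(key, 0) + value   # values accumulate
    return store

# Load a value of 100, then a correction of 40, under each update type:
ods  = update(update({}, "cust1", 100, "overwrite"), "cust1", 40, "overwrite")
cube = update(update({}, "cust1", 100, "additive"),  "cust1", 40, "additive")
assert ods["cust1"]  == 40    # overwrite keeps the latest value
assert cube["cust1"] == 140   # additive sums the deltas
```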