To make a logical decision (IF-THEN-ELSE) about the next step in a process chain, we can use the "Decision Between Multiple Alternatives" process type.
Step 1: Choose the "Decision Between Multiple Alternatives" process type and drag it into the process chain window.
Step 2: A pop-up appears. Create a new decision-making variant.
Step 3: Define the formula for IF. The formula editor is available to help.
We can define as many formulas as we have options for the next step.
Step 4: Each of the IF conditions above is mapped to an event (option), and these events can be used in the process chain to define the next step for the respective option.
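For illustration, the IF formulas for a simple two-way branch could look like the following. This is only a sketch: DATE_WEEKDAY is one of the standard date functions offered by the formula editor, but the exact field names and syntax may differ by release.

IF 1: DATE_WEEKDAY( Current Date ) = 5   -> event/option for the Friday branch (e.g. a full load)
IF 2: DATE_WEEKDAY( Current Date ) <> 5  -> event/option for every other day (e.g. a delta load)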
5 Nov 2008
DTP Filter Routine
Following is an example of a routine written in a DTP filter.
While writing the logic for l_t_range-SIGN, l_t_range-OPTION, l_t_range-LOW, etc., we should also fill l_t_range-FIELDNAME, as shown here:
* Read the existing selection row for PLANT and remember its index.
data: l_idx like sy-tabix.
read table l_t_range with key fieldname = 'PLANT'.
l_idx = sy-tabix.

data: it_plant    like table of /bi0/pplant,
      it_plant_wa type /bi0/pplant,
      it_country  like table of /bi0/pcountry.
clear: it_plant, it_country.

* All countries of market 'USA', then all plants in those countries.
select * from /bi0/pcountry into table it_country
  where /bic/zmarket2 = 'USA'.
if it_country[] is not initial.
  select * from /bi0/pplant into table it_plant
    for all entries in it_country
    where country = it_country-country.
endif.

* Append one 'I EQ' selection row per plant found.
loop at it_plant into it_plant_wa.
  l_t_range-fieldname = 'PLANT'.
  l_t_range-sign      = 'I'.
  l_t_range-option    = 'EQ'.
  l_t_range-low       = it_plant_wa-plant.
  append l_t_range.
endloop.

* Overwrite the originally found PLANT row, or append if none existed.
if l_idx <> 0.
  modify l_t_range index l_idx.
else.
  append l_t_range.
endif.
p_subrc = 0.
Labels:
Datawarehousing
13 Oct 2008
Function module to change a yellow request to RED
Use SE37 to execute the function module RSBM_GUI_CHANGE_USTATE. On the next screen, enter the request ID for I_REQUID and execute. On the screen that follows, select the 'Status Erroneous' radio button and continue. This function module changes the status of a request from green/yellow to red.
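The same change can also be scripted. A minimal sketch follows; the function module name and the I_REQUID parameter come from the steps above, while the data type of the request ID is an assumption:

data: l_requid type rsreqdone-rnr.  " assumed typing via the request number field of RSREQDONE
l_requid = 'REQU_4XYZ123'.          " hypothetical request ID
call function 'RSBM_GUI_CHANGE_USTATE'
  exporting
    i_requid = l_requid.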
Labels:
Data Loading issues
7 Oct 2008
ABAP routine in InfoPackage for Multiple Selections
Requirement: 0PLANT should be restricted at InfoPackage level using a routine that picks only selected plants (say, those under some country or region).
Following is the code I tried, which works fine.
Solution
data: l_idx       like sy-tabix,
      it_plant    like table of /bi0/pplant,
      it_plant_wa type /bi0/pplant.

* Locate the existing selection row for PLANT and remember its index.
read table l_t_range with key fieldname = 'PLANT'.
l_idx = sy-tabix.

* One 'I EQ' selection row per plant of country 'BE'.
select * from /bi0/pplant into table it_plant where country = 'BE'.
loop at it_plant into it_plant_wa.
  l_t_range-fieldname = 'PLANT'.
  l_t_range-sign      = 'I'.
  l_t_range-option    = 'EQ'.
  l_t_range-low       = it_plant_wa-plant.
  append l_t_range.
endloop.

* Guard added: MODIFY with index 0 would dump if no PLANT row was found.
if l_idx <> 0.
  modify l_t_range index l_idx.
else.
  append l_t_range.
endif.
p_subrc = 0.
Labels:
Datawarehousing
30 Sept 2008
Difference between LIS and LO Extraction
Both LIS and LO extractors are used to extract logistics data from the source system. Nowadays most clients use the LO extractor instead of the LIS extractor.
LIS is the older technique for loading this data. It uses the typical delta and full upload techniques to get data from a logistics application, and it uses the V1 and V2 update modes, i.e. a synchronous or asynchronous update from the application documents into the LIS structures at the time the application document is posted.
The LO Cockpit is the newer technique (introduced around BW 3.0). It uses the V3 update, which you can schedule, so the update does not happen while the application documents are processed but is posted at a later stage.
In LO you have separate DataSources at header, item, and schedule-line level; you can choose the level at which you want to extract your data and switch the others off, which reduces the data volume. We do not have this flexibility with LIS structures.
Almost all LIS extractors are outdated because of these disadvantages.
Differences :
1. LIS works on the transparent-table concept, whereas LO works on the cluster-table concept.
2. Each LIS DataSource, e.g. 2LIS_11_S264, is split into multiple LO DataSources such as 2LIS_11_VAHDR, 2LIS_11_VAITM, and 2LIS_11_VASCL; with this we can give more detailed information to the end users.
3. LIS works under a push mechanism, whereas LO works under a pull mechanism.
4. For deltas, two transparent tables come into the picture in LIS (the SnnnBIW1 and SnnnBIW2 twin tables), whereas LO uses queues: LBWQ (extraction queue), SM13 (update queue), and RSA7 (delta queue).
5. LO Cockpit is a Business Content extractor, whereas LIS is a customer-generated extractor.
6. LO Cockpit uses ready-made DataSources, but in LIS we need to create everything ourselves.
And not to forget:
LO Cockpit supports the V3 update (background scheduling jobs);
LIS does not support V3, only V1 (synchronous) and V2 (asynchronous) updates.
Labels:
Extraction
25 Sept 2008
What will happen if a request in Green is deleted?
Deleting a green request is harmless in itself: if you are loading via the PSA, you can go to the 'Reconstruction' tab, select the request, and use 'Insert/Reconstruct' to get it back. But
if, for example, you need to repeat that delta load from the source system, you will not get those delta records again after deleting the green request.
Explanation :
When the request is green, the source system gets the message that the data sent was loaded successfully, so the next time a delta load is triggered, only new records are sent.
If for some reason you need to repeat the same delta load from the source, turning the request red sends the message that the load was not successful, so the delta records are not discarded. The delta queue in R/3 keeps them until the next upload is performed successfully in BW, and the same records are then extracted into BW in the next requested delta load.
Labels:
Data Loading issues
24 Sept 2008
When is reconstruction allowed? Questions
1. When a request is deleted in an ODS/cube, will it be available under reconstruction?
Ans: Yes, it will be available under the Reconstruction tab, but only if the processing was through the PSA.
Note: This function is particularly useful if you are loading deltas, that is, data that you cannot request again from the source system.
2. Should the request be turned red before it is deleted from the target, so as to enable reconstruction?
Ans: To enable reconstruction you need not make the request red, but to enable a repeat of the last delta you have to make the request red before you delete it.
3. If the request is deleted with its status green, does the request get deleted from the Reconstruction tab too?
Ans: No, it won't get deleted from the Reconstruction tab.
4. Does the behaviour of reconstruction and deletion differ between targets (ODS vs. cube)?
Ans: Yes.
Labels:
General Maintenance
21 Sept 2008
Handling Amount Values with currencies in BW
SAP stores amount values of different currencies with a fixed interpretation of two decimal places. Some currencies do not work well with such a two-decimal-place setting, usually because a fraction of the currency unit is meaningless. That is true for the Japanese Yen, the Turkish Lira, the Korean Won, and many other such currencies.
TCURX table
This table determines the number of decimal places in the output according to the currency key. If the currency exists in table TCURX as currency key CURRKEY, the system sets the number of decimal places according to the CURRDEC entry in TCURX; otherwise it uses the default of two decimal places. TCURX therefore only has to list the exceptions with a number of decimal places other than 2.
Let's observe the following steps where TCURX is referred to.
• Step 1: Updating the amount values into BW data target
• Step 2: Displaying the amount values in the BW reports.
Step 1: Updating the amount values into BW data target
When we load data containing amount fields into BW, the system checks the TCURX table for the currency. If the currency is present in TCURX, it divides the amount value by 10 ** (2 - CURRDEC).
Step 2: Displaying the amount values in the BW reports.
Exactly the converse happens when such values are displayed in a report: the amount value is multiplied by 10 ** (2 - CURRDEC) in the report output.
Example
Note: The KRW (Korean Won) currency key is present in the TCURX table with a decimal-place value of 0.
Step 1:
Whenever an amount of 123 KRW is loaded into BW, the system checks the TCURX table for KRW. After finding the entry, it stores the value in the target as 123 / (10 ** (2 - 0)), that is 1.23 KRW (the amount is divided by 100).
For this to happen, we have to make a specific setting in the InfoPackage through which we schedule the loads: the checkbox “Currency Conversion for External Systs” must be ticked.
Step 2:
While displaying the same value in the report output, the exact opposite happens: 1.23 KRW is multiplied by 100 and shown as 123 KRW, the initial value that came from the source system.
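A minimal ABAP sketch of this scaling rule, using the worked example above (the TCURX fields CURRKEY and CURRDEC are from the text; everything else is illustrative):

data: l_currdec type i,
      l_factor  type p decimals 6,
      l_stored  type p decimals 6,
      l_display type p decimals 2.

select single currdec from tcurx into l_currdec
  where currkey = 'KRW'.
if sy-subrc <> 0.
  l_currdec = 2.                      " not in TCURX: default of 2 decimals
endif.
l_factor  = 10 ** ( 2 - l_currdec ).  " 100 for KRW (CURRDEC = 0)
l_stored  = 123 / l_factor.           " stored in the data target as 1.23
l_display = l_stored * l_factor.      " shown in the report as 123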
General Observations
In most cases we miss step 1, since it needs manual intervention (ticking the “Currency Conversion for External Systs” checkbox), whereas step 2 is carried out by default, so we get wrong results in the report output. Even if the amount is not divided while loading, the multiplication is still carried out by default at reporting time. That is why it is necessary to check whether step 1 was carried out successfully.
As the settings in the External Data tab of the InfoPackage apply only to data coming from an external source, the “Currency Conversion for External Systs” checkbox works only for currencies that come directly from the source system. Hence, if you write logic to pick up the currency (a lookup, etc.) in the transfer rules or update rules, step 1 will not be carried out, while step 2 still happens by default, giving wrong results in the report output.
Note: I learned this after reading:
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/600aa67b-c348-2b10-f690-aaa81d12ec6a
20 Sept 2008
PSA reverse posting
When data has been loaded into a BW InfoCube and then compressed, you can no longer delete the data of a single request from the cube. In that case, if the request was loaded via the PSA, you can click the 'Reverse Posting' option on the monitor screen of that request. This reverses the sign of the key figures loaded into the InfoCube for that request only, so that the overall key-figure values in the cube for this request become 0.
Reverse posting is carried out by the system via Monitoring --> Scheduler --> Reverse posting --> Immediate & Save.
This nullifies the values of the earlier request by sending reversed values.
It can be done only if the loaded data is still present in the PSA.
Labels:
Data Loading issues
19 Sept 2008
Unable to Cancel Job in SM37 (R3)
Question from SDN:
We have a situation: we are unable to kill a background job (BI_REQ*). It shows ACTIVE and has been running for more than 19,000 seconds (2 days). When the job started, its process ID was 244277, but after a day this process ID was allocated to another R/3 program (monitored in SM50), while the background job still shows ACTIVE. We tried killing the job in SM37 (Cancel Active Job / Delete Job), but when we try to cancel it we get the message 'The job is NOT ACTIVE', and when we try to delete it we get 'Job is still ACTIVE'.
Since there is no corresponding work process in SM50, what is the alternative way to kill this job? (In BW there is no update of data records either.)
Answer:
Go to SM37, select the job, open the 'Job' menu at the top, and choose 'Check status'.
The status will then show cancelled.
Sometimes a job actually gets cancelled but the status still shows ACTIVE; in that case this check corrects the status, since the job is already cancelled.
Labels:
Data Loading issues
17 Sept 2008
Attribute delta loading "duplicate record found"
Master data attribute loading error: "x duplicate records found. y recordings used in table z"
Fix (for the third option, see the sketch below):
Set the option "Ignore Double Data Records" in the InfoPackage,
OR
run the "Change Run" tool after every upload,
OR
in the start routine of the transfer or update rules, sort the records of the data package and then delete adjacent duplicates.
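A minimal sketch of the third option for a 3.x update rules start routine; DATA_PACKAGE is the standard table there, while the comparison field /BIC/ZMATNR is a hypothetical key characteristic:

* start routine: keep only one record per master data key
sort DATA_PACKAGE by /bic/zmatnr.
delete adjacent duplicates from DATA_PACKAGE comparing /bic/zmatnr.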
Labels:
Data Loading issues
How to suppress messages generated by BW Queries
Standard solution: You might be aware of the standard solution. In transaction RSRT, select your query and click on the 'Messages' button. There you can determine which messages for the chosen query are not to be shown to the user in the front end.
Custom solution:
Only selected messages can be suppressed using the standard solution. However, there's a clever way to implement your own solution, and you don't need to modify the system for it!
All messages are collected via the function module RRMS_MESSAGE_HANDLING. So all you have to do is implement an enhancement at the start of this function module: code your own logic to check the import parameters, such as the message class and number, and skip the remainder of the processing logic if you don't want a message to show up in the front end.
FUNCTION rrms_message_handling.
*"  IMPORTING i_class, i_type, i_number, ...  (interface unchanged)
ENHANCEMENT 1 z_check_bia.    " enhancement at the start of the FM
* Filter the BIA message RSD_TREX 136: leave the function module
* so this warning never reaches the front end.
  if i_class  = 'RSD_TREX' and
     i_type   = 'W'        and
     i_number = '136'.
    exit.
  endif.
ENDENHANCEMENT.
* ... original processing logic of the function module continues here ...
Labels:
BW Reporting
Dimension Size Vs Fact Size
The current size of all dimensions can be monitored in relation to the fact table with transaction SE38 by running the report SAP_INFOCUBE_DESIGNS. We can also test the InfoCube design with the RSRV tests, which give the dimension-to-fact ratio.
As a guideline, a dimension should be less than 10% of the fact table.
In the report,
dimension tables look like /BI[C|0]/D[xxx]
and fact tables look like /BI[C|0]/[E|F][xxx].
Use transaction LISTSCHEMA to show the different tables associated with a cube.
When a dimension grows very large in relation to the fact table, the database optimizer can no longer choose an efficient access path to the data, because the guideline that each dimension should hold less than 10 percent of the fact table's records has been violated.
A dimension with such large data growth is called a degenerate dimension. To fix it, move the characteristics to different dimensions; this can only be done when there is no data in the InfoCube.
Note: If you have a requirement to include item-level details in the cube, the dimension-to-fact ratio will naturally be higher, and you can't help that. In that case, make the item characteristic a line-item dimension, i.e. a dimension containing only that one characteristic. Since there is only one characteristic in the dimension, the fact-table entry can link directly to the SID of the characteristic without a DIM ID (the DIM ID in the dimension table usually connects the characteristic's SID with the fact table).
Since the link bypasses the dimension table (not literally, but in effect), this gives faster query performance.
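A quick manual check of one dimension's ratio, following the table patterns above (a sketch; the concrete table names /BIC/DZSALES011 and /BIC/FZSALES01 are hypothetical):

data: l_dim   type i,
      l_fact  type i,
      l_ratio type p decimals 2.
select count( * ) from /bic/dzsales011 into l_dim.   " dimension table 1 of cube ZSALES01
select count( * ) from /bic/fzsales01  into l_fact.  " F fact table of cube ZSALES01
l_ratio = l_dim * 100 / l_fact.                      " guideline: should stay below 10 (%)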
Labels:
Datawarehousing
BW Main tables
Extractor related tables:
ROOSOURCE - On source system R/3 server, filter by: OBJVERS = 'A'
Data source / DS type / delta type/ extract method (table or function module) / etc
RODELTAM - Delta type lookup table.
ROIDOCPRMS - Control parameters for data transfer from the source system, result of "SBIW - General setting - Maintain Control Parameters for Data Transfer" on OLTP system.
MAXSIZE: Maximum size of a data packet in kilobytes
STATFRQU: Frequency with which status Idocs are sent
MAXPROCS: Maximum number of parallel processes for data transfer
MAXLINES: Maximum Number of Lines in a DataPacket
MAXDPAKS: Maximum Number of Data Packages in a Delta Request
SLOGSYS: Source system
Query related tables:
RSZELTDIR: filter by: OBJVERS = 'A', DEFTP: REP - query, CKF - Calculated key figure
Reporting component elements, query, variable, structure, formula, etc
RSZELTTXT: Similar to RSZELTDIR. Texts of reporting component elements
To get a list of query elements built on that cube:
RSZELTXREF: filter by: OBJVERS = 'A', INFOCUBE= [cubename]
To get all queries of a cube (see the sketch after this list):
RSRREPDIR: filter by: OBJVERS = 'A', INFOCUBE= [cubename]
To get query change status (version, last changed by, owner) of a cube:
RSZCOMPDIR: OBJVERS = 'A'
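For instance, the RSRREPDIR lookup above can be written in ABAP as follows (a sketch; the InfoCube name is hypothetical):

data lt_queries type table of rsrrepdir.
select * from rsrrepdir into table lt_queries
  where objvers  = 'A'
    and infocube = 'ZSALES01'.   " hypothetical InfoCube name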
Workbooks related tables:
RSRWBINDEX List of binary large objects (Excel workbooks)
RSRWBINDEXT Titles of binary objects (Excel workbooks)
RSRWBSTORE Storage for binary large objects (Excel workbooks)
RSRWBTEMPLATE Assignment of Excel workbooks as personal templates
RSRWORKBOOK 'Where-used list' for reports in workbooks
Web templates tables:
RSZWOBJ Storage of the Web Objects
RSZWOBJTXT Texts for Templates/Items/Views
RSZWOBJXREF Structure of the BW Objects in a Template
RSZWTEMPLATE Header Table for BW HTML Templates
Data target loading/status tables:
rsreqdone, " Request-Data
rsseldone, " Selection for current Request
rsiccont, " Request posted to which InfoCube
rsdcube, " Directory of InfoCubes / InfoProvider
rsdcubet, " Texts for the InfoCubes
rsmonfact, " Fact table monitor
rsdodso, " Directory of all ODS Objects
rsdodsot, " Texts of ODS Objects
sscrfields. " Fields on selection screens
Tables holding characteristics:
RSDCHABAS: fields
OBJVERS -> A = active; M=modified; D=delivered
(business content characteristics that have only D version and no A version means not activated yet)
TXTTABFL -> = x -> has text
ATTRIBFL -> = x -> has attribute
RODCHABAS: with fields TXTSHFL,TXTMDFL,TXTLGFL,ATTRIBFL
RSREQICODS: Requests in ODS
RSMONICTAB: All requests
Transfer Structures live in PSAPODSD
/BIC/B0000174000 Transfer Structure
Master Data lives in PSAPSTABD
/BIC/HXXXXXXX Hierarchy:XXXXXXXX
/BIC/IXXXXXXX SID Structure of hierarchies:
/BIC/JXXXXXXX Hierarchy intervals
/BIC/KXXXXXXX Conversion of hierarchy nodes - SID:
/BIC/PXXXXXXX Master data (time-independent):
/BIC/SXXXXXXX Master data IDs:
/BIC/TXXXXXXX Texts: Char.
/BIC/XXXXXXXX Attribute SID table:
Master Data views
/BIC/MXXXXXXX master data tables:
/BIC/RXXXXXXX View SIDs and values:
/BIC/ZXXXXXXX View hierarchy SIDs and nodes:
InfoCube Names in PSAPDIMD
/BIC/Dcube_name1 Dimension 1
......
/BIC/Dcube_nameA Dimension 10
/BIC/Dcube_nameB Dimension 11
/BIC/Dcube_nameC Dimension 12
/BIC/Dcube_nameD Dimension 13
/BIC/Dcube_nameP Data Packet
/BIC/Dcube_nameT Time
/BIC/Dcube_nameU Unit
PSAPFACTD
/BIC/Ecube_name Fact Table (inactive)
/BIC/Fcube_name Fact table (active)
ODS Table names (PSAPODSD)
BW3.5
/BIC/AXXXXXXX00 ODS object XXXXXXX : Active records
/BIC/AXXXXXXX40 ODS object XXXXXXX : New records
/BIC/AXXXXXXX50 ODS object XXXXXXX : Change log
Previously:
/BIC/AXXXXXXX00 ODS object XXXXXXX : Active records
/BIC/AXXXXXXX10 ODS object XXXXXXX : New records
T-code tables:
tstc -- table of transaction code, text and program name
tstct - t-code text
Labels:
Datawarehousing
Production Support Issues in BW
Production Support Errors :
1) Invalid characters while loading: When you load data, it may contain special characters such as @#$%, and BW will throw an 'invalid characters' error. Go to transaction RSKC, enter all the invalid characters, and execute; they are stored in the table RSALLOWEDCHAR. Then reload the data. You won't get the error any more, because these characters have been made permissible via RSKC.
2) IDoc or tRFC error:
We see the following error on the 'Status' screen:
Sending packages from OLTP to BW lead to errors
Diagnosis
No IDocs could be sent to the SAP BW using RFC.
System response
There are IDocs in the source system ALE outbox that did not arrive in the ALE inbox of the SAP BW.
Further analysis:
Check the tRFC log.
You can get to this log using the wizard or the menu path "Environment -> Transact. RFC -> In source system".
Removing errors:
If the tRFC is incorrect, check whether the source system is completely connected to SAP BW. In particular, check the authorizations of the background user in the source system.
Action to be taken:
If the source system connection is OK, reload the data.
3) PROCESSING IS OVERDUE FOR PROCESSED IDOCs
Diagnosis
IDocs were found in the ALE inbox for the source system that have not been updated.
Processing is overdue.
Error correction:
Attempt to process the IDocs manually. You can process them using the wizard or by selecting the IDocs with incorrect status and processing them manually.
Analysis:
Looking at the above error messages, we find that IDocs are sitting in the ALE inbox for the source system without being updated.
Action to be taken:
We can process the IDocs manually via RSMO -> Header tab -> 'Process manually'.
4) LOCK NOT SET FOR LOADING MASTER DATA (TEXT / ATTRIBUTE / HIERARCHY)
Diagnosis
User ALEREMOTE is preventing you from loading texts to characteristic 0COSTCENTER. The lock was set by a master data loading process with the request number.
System response
For reasons of consistency, the system cannot allow the update to continue, and it has terminated the process.
Procedure
Wait until the process that is causing the lock is complete. You can call transaction SM12 to display a list of the locks.
If a process terminates, the locks that it has set are reset automatically.
Analysis:
Looking at the above error messages, we find that the master data is locked by another load.
Action to be taken:
Wait for some time and try reloading the master data manually from the InfoPackage in RSA1.
5) Flat File Loading Error
Detail Error Message
Diagnosis
Data records were marked as incorrect in the PSA.
System response
The data package was not updated.
Procedure
Correct the incorrect data records in the data package (for example, by editing them manually in PSA maintenance). You can find the error message for each record in the PSA by double-clicking on the record status.
Analysis:
Looking at the above error messages, we find that the PSA contains incorrect records.
Action to be taken:
To resolve this issue there are two methods:
i) Rectify the data in the source system and then load the data again.
ii) Correct the incorrect records in the PSA and then upload the data into the data target from there.
6) Object requested is currently locked by user ALEREMOTE
Detail Error Message.
Diagnosis
An error occurred in BI while processing the data. The error is documented in an error message. Object requested is currently locked by user ALEREMOTE.
Procedure
Look in the lock table to establish which user or transaction is holding the requested lock (Tools -> Administration -> Monitor -> Lock entries).
Analysis:
Looking at the above error messages, we find that the object is locked. This must have happened because some other background process was running on it.
Action to be taken:
Delete the error request, wait for some time, and repeat the chain.
Labels:
Data Loading issues
Selective Deletion in Process Chain
The standard procedure :
Use Program RSDRD_DELETE_FACTS
1. Create a variant (stored in the table RSDRBATCHPARA) for the selection to be deleted from a data target.
2. Execute the generated program.
Observations:
The generated program deletes the data from the data target based on the given selections. It also removes the variant created for this selective deletion from the RSDRBATCHPARA table, so the generated program won't delete anything on a second execution.
If we want to use this program for scheduling in a process chain, we can comment out the step where the program removes the generated variant.
Eg:
REPORT ZSEL_DELETE_QM_C10 .
TYPE-POOLS: RSDRD, RSDQ, RSSG.
DATA:
L_UID TYPE RSSG_UNI_IDC25,
L_T_MSG TYPE RS_T_MSG,
L_THX_SEL TYPE RSDRD_THX_SEL.
L_UID = 'D2OP7A6385IJRCKQCQP6W4CCW'.
IMPORT I_THX_SEL TO L_THX_SEL
FROM DATABASE RSDRBATCHPARA(DE) ID L_UID.
* DELETE FROM DATABASE RSDRBATCHPARA(DE) ID L_UID.
CALL FUNCTION 'RSDRD_SEL_DELETION'
EXPORTING
I_DATATARGET = '0QM_C10'
I_THX_SEL = L_THX_SEL
I_AUTHORITY_CHECK = 'X'
I_THRESHOLD = '1.0000E-01'
I_MODE = 'C'
I_NO_LOGGING = ''
I_PARALLEL_DEGREE = 1
I_NO_COMMIT = ''
I_WORK_ON_PARTITIONS = ''
I_REBUILD_BIA = ''
I_WRITE_APPLICATION_LOG = 'X'
CHANGING
C_T_MSG = L_T_MSG.
export l_t_msg to memory id sy-repid.
UPDATE RSDRBATCHREP
SET DELETEABLE = 'X'
WHERE REPID = 'ZSEL_DELETE_QM_C10'.
Variants are to be created per data target, based on the required selections.
If we wish to create a separate program that defines the data to be deleted explicitly, we can hardcode the selection for deletion in the program.
Eg:
REPORT ZSEL_DELETE_ZCSAL_1_A00 .
TYPE-POOLS: RSDRD, RSDQ, RSSG.
DATA:
L_UID TYPE RSSG_UNI_IDC25,
L_T_MSG TYPE RS_T_MSG,
L_THX_SEL TYPE RSDRD_SX_SEL,
LT_THX_SEL TYPE RSDRD_THX_SEL.
DATA LT_TAB TYPE RSDRD_T_RANGE.
DATA WA_TAB TYPE RSDRD_S_RANGE.
**Entry for 0CO_AREA = S999**
CLEAR LT_TAB.
WA_TAB-SIGN = 'I'.
WA_TAB-OPTION = 'EQ'.
WA_TAB-LOW = 'S999'.
WA_TAB-KEYFL = 'X'.
APPEND WA_TAB TO LT_TAB.
L_THX_SEL-IOBJNM = '0CO_AREA'.
L_THX_SEL-T_RANGE[] = LT_TAB[].
INSERT L_THX_SEL INTO TABLE LT_THX_SEL.
**Entry for 0VERSION = A00**
CLEAR LT_TAB.
WA_TAB-SIGN = 'I'.
WA_TAB-OPTION = 'EQ'.
WA_TAB-LOW = 'A00'.
WA_TAB-KEYFL = 'X'.
APPEND WA_TAB TO LT_TAB.
L_THX_SEL-IOBJNM = '0VERSION'.
L_THX_SEL-T_RANGE[] = LT_TAB[].
INSERT L_THX_SEL INTO TABLE LT_THX_SEL.
**Entry for 0CALMONTH BT Current Month till year end
CLEAR LT_TAB.
WA_TAB-SIGN = 'I'.
WA_TAB-OPTION = 'BT'.
WA_TAB-LOW = SY-DATUM+0(6).
CONCATENATE SY-DATUM+0(4) '12' INTO WA_TAB-HIGH.
WA_TAB-KEYFL = 'X'.
APPEND WA_TAB TO LT_TAB.
L_THX_SEL-IOBJNM = '0CALMONTH'.
L_THX_SEL-T_RANGE[] = LT_TAB[].
INSERT L_THX_SEL INTO TABLE LT_THX_SEL.
CALL FUNCTION 'RSDRD_SEL_DELETION'
EXPORTING
I_DATATARGET = 'ZCSAL_1'
I_THX_SEL = LT_THX_SEL
I_AUTHORITY_CHECK = 'X'
I_THRESHOLD = '1.0000E-01'
I_MODE = 'C'
I_NO_LOGGING = ''
I_PARALLEL_DEGREE = 1
I_NO_COMMIT = ''
I_WORK_ON_PARTITIONS = ''
I_REBUILD_BIA = ''
I_WRITE_APPLICATION_LOG = 'X'
CHANGING
C_T_MSG = L_T_MSG.
export l_t_msg to memory id sy-repid.
*UPDATE RSDRBATCHREP
* SET DELETEABLE = 'X'
* WHERE REPID = 'ZSEL_DELETE_QM_C10'.
Labels:
General Maintenance
16 Sept 2008
How to Debug Update and Transfer Rules
1. Go to the monitor.
2. Select the 'Details' tab.
3. Click on 'Processing'.
4. Right-click any data package.
5. Select 'Simulate update'.
6. Tick the checkboxes 'Activate debugging in transfer rules' and 'Activate debugging in update rules'.
7. Click 'Perform simulation'.
Labels:
General Maintenance
10 Sept 2008
BW tables
Custom InfoObject Tables:
/BIC/M -- View of Master data Tables
/BIC/P -- Master data Table, Time Independent attributes
/BIC/Q -- Master data Table, Time Dependent attributes
/BIC/X -- SID Table, Time Independent
/BIC/Y -- SID Table, Time Dependent
/BIC/T -- Text Table
/BIC/H -- Hierarchy Table
/BIC/K -- Hierarchy SID Table
Standard InfoObject Tables (Business Content):
Replace "C" with "0" in above tables.
Ex: /BI0/M -- View of Master data Tables
Standard InfoCUBE Tables :
/BI0/F -- Fact Table(Before Compression)
/BI0/E -- Fact Table(After Compression)
/BI0/P -- Dimension Table - Data Package
/BI0/T -- Dimension Table - Time
/BI0/U -- Dimension Table - Unit
/BI0/1, 2, 3, .......A,B,C,D : -- Dimension Tables
BW Tables:
BTCEVTJOB -- To check List of jobs waiting for events
ROOSOURCE -- Control parameters for Datasource
ROOSFIELD -- Control parameters for Datasource
ROOSPRMSC -- Control parameters for Datasource
ROOSPRMSF -- Control parameters for Datasource
-- More info @ ROOSOURCE weblog
RSOLTPSOURCE -- Replicate Table for OLTP source in BW
RSDMDELTA -- Datamart Delta Management
RSSDLINITSEL, RSSDLINITDEL
-- Last valid Initialization to an OLTP Source
RSUPDINFO -- Infocube to Infosource correlation
RSUPDDAT -- Update rules key figures
RSUPDENQ -- Removal of locks in the update rules
RSUPDFORM -- BW: Update Rules - Formulas - Checking Table
RSUPDINFO -- Update info (status and program)
RSUPDKEY -- Update rule: Key per key figure
RSUPDROUT -- Update rules - ABAP routine - check table
RSUPDSIMULD -- Table for saving simulation data update
RSUPDSIMULH -- Table for saving simulation data header information
RSDCUBEIOBJ -- Infoobjects per Infocube
RSIS -- InfoSource Info
RSUPDINFO -- Update Rules Info
RSTS -- Transfer Rules Info
RSKSFIELD -- Communication Structure fields
RSALLOWEDCHAR -- Special Characters Table(T Code : RSKC, To maintain)
RSDLPSEL -- Selection table for fields scheduler (InfoPackages)
RSDLPIO -- Log data packet no
RSMONICTAB -- Monitor, Data Targets(Infocube/ODS) Table, request related info
RSTSODS -- Operational data store for Transfer structure
RSZELTDIR -- Query Elements
RSZGLOBV -- BEx Variables
RSZELTXREF, RSZCOMPDIR -- Report/query relevant tables
RSCUSTV -- Query settings
RSDIOBJ -- Infoobjects
RSLDPSEL -- Selection table for fields scheduler (InfoPackage list)
RSMONIPTAB -- InfoPackage for the monitor
RSRWORKBOOK -- Workbooks & related query GENUNIIDs
RSRREPDIR -- Contains GENUNIID, report name, author, etc.
RSRINDEXT -- Workbook ID & Name
RSREQDONE -- Monitor: Saving of the QM entries
RSSELDONE -- Monitor: Selection for executed requests
RSLDTDONE -- Texts on the requested InfoPackages & groups
RSUICDONE -- BIW: Selection table for user-selection update of InfoCubes
RSSDBATCH -- Table for Batch run scheduler
RSLDPDEL -- Selection table for deleting with full update scheduler
RSADMINSV -- RS Administration
RSSDLINIT -- Last Valid Initializations to an OLTP Source
BTCEVTJOB --To check event status(scheduled or not)
VARI -- ABAP Variant related Table
VARIDESC -- Selection Variants: Description
SMQ1 -- QRFC Monitor(Out Bound)
SM13 -- Update Records status
Labels:
Datawarehousing
3 Sept 2008
How to define F4 order help for an InfoObject for reporting
Open the Attributes tab of the InfoObject definition.
There you will see a column for the F4 order help against each attribute of that InfoObject.
This field defines whether and where the attribute appears in the value help.
Valid values:
• 00: The attribute does not appear in the value help.
• 01: The attribute appears at the first position (to the left) in the value help.
• 02: The attribute appears at the second position in the value
help.
• 03: ......
• Altogether, only 40 fields are permitted in the input help. In addition to the attributes, the characteristic itsel, its texts, and the compounded characteristics are also generated in the input help. The total number of these fields cannot exceed 40.
So accordingly , the inofobjects are changed> Suppose if say for infobject 0vendor, if in case 0country ( which is an attribute of 0vendor) is not be shown in the F4 help of 0vendor , then mark 0 against the attribtue 0country in the infoobject definition of 0vendor.
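To cross-check the setting outside the maintenance UI, here is a hedged sketch. It assumes the attribute metadata sits in table RSDBCHATR with an F4ORDER column (table and field names are an assumption and may differ by release); it lists the F4 order of all attributes of 0VENDOR:
* Assumption: RSDBCHATR holds one row per attribute of a characteristic;
* CHABASNM = characteristic, ATTRINM = attribute, F4ORDER = position
* in the value help (00 = hidden). Names may differ by release.
DATA lt_attr TYPE TABLE OF rsdbchatr.
FIELD-SYMBOLS <ls_attr> TYPE rsdbchatr.

SELECT * FROM rsdbchatr INTO TABLE lt_attr
  WHERE chabasnm = '0VENDOR'
    AND objvers  = 'A'.

LOOP AT lt_attr ASSIGNING <ls_attr>.
  WRITE: / <ls_attr>-attrinm, <ls_attr>-f4order.
ENDLOOP.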
Labels:
BW Reporting,
Datawarehousing
TCURF, TCURR and TCURX
TCURF is always used in reference to the exchange rate (in the case of currency translation). For example, say we want to convert figures from a FROM currency to a TO currency at the daily average rate (M), and we have an exchange rate of 2,642.34. The factors for this currency combination for M in TCURF are, say, 100,000:1. The effective exchange rate then becomes 0.02642.
Question (taken from SDN):
Can't we have an exchange rate of 0.02642 and not use the factors from the TCURF table at all? I suppose we still have to maintain the factors as 1:1 in TCURF if we use an exchange rate of 0.02642. Am I right? But why is this so? Can't I get rid of TCURF?
What is the use of TCURF co-existing with TCURR?
Answer:
Normally it's used to allow you greater precision in calculations,
i.e. 0.00011 with no factors gives a different result to
0.00111 with a factor of 10:1.
So, based on the above answer, TCURF allows greater precision in calculations.
Its factor should be taken into account before the exchange rate is applied.
-------------------------------------------------------------------------------------
TCURR
The TCURR table is generally used when we create currency conversion types.
A currency conversion type refers to the entries in TCURR, defined per currency pair (with a time reference), to get the exchange rate from the source currency to the target currency.
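To make the interplay of TCURR and TCURF concrete, a minimal sketch reproducing the numbers from the example above (UKURS is the rate field in TCURR; FFACT/TFACT are the from/to factors maintained in TCURF; the values are hard-coded here for illustration):
* Effective rate = UKURS * TFACT / FFACT
DATA: lv_ukurs TYPE p DECIMALS 5 VALUE '2642.34',  " TCURR-UKURS
      lv_ffact TYPE i VALUE 100000,                " TCURF-FFACT (from-factor)
      lv_tfact TYPE i VALUE 1,                     " TCURF-TFACT (to-factor)
      lv_eff   TYPE p DECIMALS 7.

lv_eff = lv_ukurs * lv_tfact / lv_ffact.
WRITE: / 'Effective exchange rate:', lv_eff.       " 0.0264234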
-------------------------------------------------------------------------------------
TCURX
The TCURX table defines the correct number of decimal places for a currency. Its effect shows up in the BEx report output.
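A short hedged sketch of the effect (TCURX has the key CURRKEY and the decimals field CURRDEC; amounts are stored with two decimals internally, so the displayed value is the stored value shifted by 2 - CURRDEC places; JPY is used as the example):
* Displayed value = stored value * 10 ** ( 2 - CURRDEC )
DATA: lv_stored  TYPE p DECIMALS 2 VALUE '10.00',  " value as stored in the DB
      lv_currdec TYPE i,
      lv_display TYPE p DECIMALS 2.

* JPY is usually maintained in TCURX with CURRDEC = 0
SELECT SINGLE currdec FROM tcurx INTO lv_currdec
  WHERE currkey = 'JPY'.
IF sy-subrc <> 0.
  lv_currdec = 2.  " currencies not listed in TCURX default to 2 decimals
ENDIF.

lv_display = lv_stored * ( 10 ** ( 2 - lv_currdec ) ).
WRITE: / lv_display, 'JPY'.  " 1000.00 JPY for a stored 10.00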
-------------------------------------------------------------------------------------
Labels:
Datawarehousing
Error loading master data - Data record 1 ('AB031005823') : Version 'AB031005823' is not valid
Problem
Created a flat-file DataSource for uploading master data. The data loaded fine up to the PSA. But once the DTP that runs the transformation was scheduled, it ended with the error quoted in the title.
Solution
After referring to many links on SDN, I found that since the data comes from an external file, it does not match the SAP-internal format. So we should select the "External" format option in the DataSource (in this case for the Material field) and apply the conversion routine MATN1.
Once the above changes were made, the load was successful.
Knowledge from SDN forums
Conversion takes place when converting the contents of a screen field from display format to SAP-internal format and vice versa and when outputting with the ABAP statement WRITE, depending on the data type of the field.
Check the info here:
http://help.sap.com/saphelp_nw04/helpdata/en/2b/e9a20d3347b340946c32331c96a64e/content.htm
http://help.sap.com/saphelp_nw04/helpdata/en/07/6de91f463a9b47b1fedb5be18699e7/content.htm
This FM (MATN1) adds leading zeros to the material number: if you query MAKT with MATNR as just 123, you will not get any values, so you should use this conversion exit to add the leading zeros.
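For example, the conversion exit can be called directly in ABAP; a minimal sketch converting an external material number to the internal, zero-padded format:
* Minimal sketch: external '123' -> internal zero-padded MATNR.
DATA: lv_ext TYPE matnr VALUE '123',
      lv_int TYPE matnr.

CALL FUNCTION 'CONVERSION_EXIT_MATN1_INPUT'
  EXPORTING
    input        = lv_ext
  IMPORTING
    output       = lv_int
  EXCEPTIONS
    length_error = 1
    OTHERS       = 2.
IF sy-subrc = 0.
  WRITE: / lv_int.  " 000000000000000123 on an 18-char MATNR system
ENDIF.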
Labels:
Data Loading issues