Tuesday 26 August 2014

Getting the Buffer Quality metric from an Oracle System

The requirement was to get the “Buffer Quality” value out of BASIS transaction ST04. The technique for this will vary from system to system, as it depends on the underlying database.


The field behind this is DBUFF_QUALITY and ST04 is underpinned by program RSDB0004.

A quick look at this tells us that RSDB0004 is just a CASE statement that filters on the database system and calls the corresponding transaction. The one I’m interested in is the Oracle database:


Going into this, I did a search on “DBUFF_QUALITY” and found that it was held in function module C_ORA_MM_GET_DATA_V10.

I ran this in SE37, without any parameters populated, and in the response there was a structure called S_ST04N_DATA containing the field DBUFF_QUALITY, which is what I was after.
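Pulled into a quick report, the call would look something like this (a sketch only: the parameter name S_ST04N_DATA comes from what SE37 showed, but the DDIC type name and exact interface are assumptions, so check the function module’s signature in SE37/SE11 before relying on it):

  " Hedged sketch: read the Oracle buffer quality via the FM found above.
  " ST04N_DATA is an assumed type name -- verify the actual DDIC type
  " of the S_ST04N_DATA parameter in SE37 before using.
  DATA: ls_st04n TYPE st04n_data.

  CALL FUNCTION 'C_ORA_MM_GET_DATA_V10'
    IMPORTING
      s_st04n_data = ls_st04n.

  WRITE: / 'Buffer quality:', ls_st04n-dbuff_quality.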

Nice and Simple!              



Friday 22 August 2014

CSV Files with Commas in the fields....

Okay, so the initial requirement was to load a CSV file into SAP and process the content. This all seemed fine, and I processed it as follows:

  CALL FUNCTION 'GUI_UPLOAD'
    EXPORTING
      filename                      = v_filename_string
      filetype                      = 'DAT'
    TABLES
      data_tab                      = itab.

  LOOP AT itab.
    SPLIT itab-text AT ',' INTO w_input-matnr w_input-desc w_input-date w_input-demand.
    APPEND w_input TO t_input.
  ENDLOOP.

But it became apparent that the users, saving the data from Excel as CSV, wanted to include commas in the field values (specifically in w_input-desc). Obviously the split treats these as delimiters, which messes up the date and demand values.

To get round this, instead of GUI_UPLOAD, we’re going to use:


  CALL FUNCTION 'KCD_CSV_FILE_TO_INTERN_CONVERT'
    EXPORTING
      i_filename      = l_filename
      i_separator     = ','
    TABLES
      e_intern        = lt_csv
    EXCEPTIONS
      upload_csv      = 1
      upload_filetype = 2
      OTHERS          = 3.

This results in a slightly awkward table:

     Row  Col  Value
1    0001 0001 40253
2    0001 0002 Crispy Aromatic, Half Duck 20x32
3    0001 0003 14:32
4    0001 0004 40
5    0002 0001 40253
6    0002 0002 Crispy Aromatic, Half Duck 20x32
7    0002 0003 14:33
8    0002 0004 50
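
For reference, the Excel-saved file behind that table would look something like this (illustrative rows only; Excel wraps any field containing a comma in double quotes, and the function module evidently honours that quoting, since the comma survives intact in column 0002):

  40253,"Crispy Aromatic, Half Duck 20x32",14:32,40
  40253,"Crispy Aromatic, Half Duck 20x32",14:33,50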


But I can loop round this, and create my t_input table from there…

" Tee up the t_input table: one (initial) line per CSV row.
lt_csv_bluff[] = lt_csv[].
DELETE ADJACENT DUPLICATES FROM lt_csv_bluff COMPARING row.

LOOP AT lt_csv_bluff INTO lv_csv_bluff.
  APPEND w_input TO t_input.
ENDLOOP.


LOOP AT lt_csv INTO lv_csv.
  ASSIGN COMPONENT lv_csv-col OF STRUCTURE w_input TO <fs>.
  <fs> = lv_csv-value.
  MODIFY t_input FROM w_input INDEX lv_csv-row.
ENDLOOP.
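
For completeness, the declarations the two loops assume would look roughly like this (a sketch: ty_input mirrors the four columns split out earlier, and kcde_cells is the row/col/value structure KCD_CSV_FILE_TO_INTERN_CONVERT returns in e_intern — verify both type names in SE11, as the field lengths here are guesses):

  " Hedged sketch of the declarations the snippets above assume.
  TYPES: BEGIN OF ty_input,
           matnr  TYPE char18,   " material number
           desc   TYPE char40,   " description (may contain commas)
           date   TYPE char10,
           demand TYPE char10,
         END OF ty_input.

  DATA: lt_csv       TYPE STANDARD TABLE OF kcde_cells,
        lv_csv       TYPE kcde_cells,
        lt_csv_bluff TYPE STANDARD TABLE OF kcde_cells,
        lv_csv_bluff TYPE kcde_cells,
        t_input      TYPE STANDARD TABLE OF ty_input,
        w_input      TYPE ty_input.

  FIELD-SYMBOLS: <fs> TYPE any.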


Thursday 21 August 2014

Infinite Loop for background debugging

I use this all the time for debugging SAP ABAP "background" operations. Sometimes you can't debug straight into a bit of code, as it's executed on the server, not in a frontend session.
So, what I do, is use this simple bit of code:

 data: a, b.
 a = 'X'.
 do.
   if a = b.
     exit.
   endif.
 enddo.

Which, as you can probably tell, sets the program into an infinite loop, which can only be interrupted by debugging.

This is done in transaction SM51: find the work process with your (or the system) name on it, and choose Program → Debugging from the menu.


At this point, you can then clear out the "a" variable, which then means that a = b, and so the loop gets exited. You can then debug through the background process.

There are other ways of doing it, but I've found this one to be the simplest and most reliable.

Don't forget to take it out again once you're finished, otherwise you'll paralyse the backend!