
Automatic Function Module Usage Collection with ABAP, Python, and SAP Joule


Why This Matters

Basis teams often need to know:

  • Which BAPIs, RFCs, or custom Z* modules are heavily used?

  • Where are the performance bottlenecks?

  • How can we forecast licensing and transport decisions?

Traditionally, this requires manual exports from ST03N/STAD. We can automate it end-to-end using:

  • ABAP collector job → Extracts FM usage from ST03N/STAD.

  • Python service → Stores usage, runs analytics, and exposes APIs.

  • SAP Joule skill → Lets users ask: “Which Z modules spiked this week?”


Solution Architecture

SAP ABAP Collector (Z_FM_USAGE_COLLECTOR)
        ↓ (JSON push)
Python REST API (Flask / FastAPI)
        ↓
SQLite / HANA Cloud
        ↓
SAP Joule (Custom Skill)

Step 1 – ABAP Collector Program

We use SAP’s workload collector FM SWNC_COLLECTOR_GET_AGGREGATES to pull RFC/BAPI usage.

REPORT z_fm_usage_collector.

TYPES: BEGIN OF ty_fm_usage,
         system_sid   TYPE sysysid,
         client       TYPE mandt,
         fm_name      TYPE string,
         tasktype     TYPE string,
         period_start TYPE d,
         calls        TYPE i,
         avg_resp_ms  TYPE i,
         total_time_ms TYPE i,
         source       TYPE string,
       END OF ty_fm_usage.

DATA: lt_rfcsrvr TYPE STANDARD TABLE OF swncaggrfcsrvr,
      lt_usage   TYPE STANDARD TABLE OF ty_fm_usage,
      lv_json    TYPE string.

" Date arithmetic on sy-datum yields an integer calculation type,
" so assign it to a typed date variable before passing it on
DATA lv_yesterday TYPE d.
lv_yesterday = sy-datum - 1.

CALL FUNCTION 'SWNC_COLLECTOR_GET_AGGREGATES'
  EXPORTING
    component  = 'TOTAL'
    periodtype = 'D'
    periodstrt = lv_yesterday
  TABLES
    rfcsrvr    = lt_rfcsrvr.

LOOP AT lt_rfcsrvr ASSIGNING FIELD-SYMBOL(<ls_rfc>).
  APPEND VALUE ty_fm_usage(
    system_sid   = sy-sysid
    client       = sy-mandt
    fm_name      = <ls_rfc>-rfcname
    tasktype     = <ls_rfc>-tasktype
    period_start = sy-datum - 1
    calls        = <ls_rfc>-calls
    avg_resp_ms  = <ls_rfc>-avgresp
    total_time_ms = <ls_rfc>-totalresp
    source       = 'ST03N'
  ) TO lt_usage.
ENDLOOP.

" Serialize to JSON: CALL TRANSFORMATION id into a string would emit asXML
" with upper-case names, so use /ui2/cl_json for a flat, lower-case array
lv_json = /ui2/cl_json=>serialize(
            data        = lt_usage
            pretty_name = /ui2/cl_json=>pretty_mode-low_case ).

" Push to Python API (create_by_url returns the client via IMPORTING)
cl_http_client=>create_by_url(
  EXPORTING url    = 'http://<python-server>:5000/ingest'
  IMPORTING client = DATA(lo_http) ).
lo_http->request->set_method( 'POST' ).
lo_http->request->set_header_field( name = 'Content-Type' value = 'application/json' ).
lo_http->request->set_cdata( lv_json ).
lo_http->send( ).
lo_http->receive( ).

📌 Schedule this report hourly with SM36.
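For reference, the payload the collector pushes is a flat JSON array of usage records. A hypothetical example record (field names mirror the ABAP ty_fm_usage structure; the values are made up for illustration), serialized with Python's json module:

```python
import json

# Hypothetical example of one record as the Python /ingest endpoint expects it;
# keys match the ABAP structure components, values are illustrative only.
sample_payload = [
    {
        "system_sid": "PRD",
        "client": "100",
        "fm_name": "Z_GET_CUSTOMER_DATA",
        "tasktype": "RFC",
        "period_start": "2024-05-01",
        "calls": 1523,
        "avg_resp_ms": 48,
        "total_time_ms": 73104,
        "source": "ST03N",
    }
]

print(json.dumps(sample_payload, indent=2))
```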


Step 2 – Python REST API

We use Flask to ingest ABAP JSON and provide endpoints for analytics.

from flask import Flask, request, jsonify
import sqlite3
from datetime import datetime

app = Flask(__name__)
DB_FILE = "fm_usage.db"

def init_db():
    with sqlite3.connect(DB_FILE) as conn:
        c = conn.cursor()
        c.execute("""
        CREATE TABLE IF NOT EXISTS fm_usage (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            system_sid TEXT, client TEXT, fm_name TEXT,
            tasktype TEXT, period_start TEXT,
            calls INTEGER, avg_resp_ms INTEGER,
            total_time_ms INTEGER, source TEXT
        )
        """)
        conn.commit()
init_db()

@app.route("/ingest", methods=["POST"])
def ingest():
    data = request.get_json(force=True)
    with sqlite3.connect(DB_FILE) as conn:
        c = conn.cursor()
        for rec in data:
            c.execute("""
                INSERT INTO fm_usage (system_sid, client, fm_name, tasktype, period_start,
                                       calls, avg_resp_ms, total_time_ms, source)
                VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)
            """, (
                rec.get("system_sid"),
                rec.get("client"),
                rec.get("fm_name"),
                rec.get("tasktype"),
                rec.get("period_start", datetime.now().strftime("%Y-%m-%d")),
                rec.get("calls"),
                rec.get("avg_resp_ms"),
                rec.get("total_time_ms"),
                rec.get("source", "ABAP")
            ))
        conn.commit()
    return jsonify({"status": "ok", "records": len(data)})

@app.route("/top_modules", methods=["GET"])
def top_modules():
    system = request.args.get("system")
    pattern = request.args.get("pattern", "%")
    limit = int(request.args.get("limit", 10))
    with sqlite3.connect(DB_FILE) as conn:
        q = """
        SELECT fm_name, SUM(calls), AVG(avg_resp_ms), SUM(total_time_ms)
        FROM fm_usage
        WHERE fm_name LIKE ? AND system_sid = ?
        GROUP BY fm_name
        ORDER BY SUM(calls) DESC
        LIMIT ?
        """
        rows = conn.execute(q, (pattern, system, limit)).fetchall()
    return jsonify([
        {"fm_name": r[0], "calls": r[1], "avg_resp_ms": r[2], "total_time_ms": r[3]}
        for r in rows
    ])

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
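To see what the /top_modules aggregation returns, here is a self-contained sketch of the same SQL run against an in-memory database; the three rows and their call counts are made up for illustration:

```python
import sqlite3

# In-memory database with the same columns the /top_modules query reads;
# the rows are fabricated sample data.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE fm_usage (system_sid TEXT, fm_name TEXT,"
    " calls INTEGER, avg_resp_ms INTEGER, total_time_ms INTEGER)"
)
conn.executemany(
    "INSERT INTO fm_usage VALUES (?, ?, ?, ?, ?)",
    [
        ("PRD", "Z_GET_CUSTOMER_DATA", 1000, 40, 40000),
        ("PRD", "Z_GET_CUSTOMER_DATA", 500, 60, 30000),
        ("PRD", "BAPI_USER_GET_DETAIL", 300, 20, 6000),
    ],
)

# Same aggregation as the endpoint: Z* modules in PRD, busiest first
top = conn.execute(
    """
    SELECT fm_name, SUM(calls), AVG(avg_resp_ms), SUM(total_time_ms)
    FROM fm_usage
    WHERE fm_name LIKE ? AND system_sid = ?
    GROUP BY fm_name
    ORDER BY SUM(calls) DESC
    LIMIT ?
    """,
    ("Z%", "PRD", 10),
).fetchall()

print(top)  # [('Z_GET_CUSTOMER_DATA', 1500, 50.0, 70000)]
```

The two Z_GET_CUSTOMER_DATA rows collapse into one aggregate; the BAPI row is filtered out by the `LIKE 'Z%'` pattern.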

Step 3 – SAP Joule Skill

  1. In Joule Studio, create skill: FM Usage Insights.

  2. Connect it to the Python API (Destination).

  3. Define intents:

    • “Top 10 FMs in PRD last week” → /top_modules?system=PRD&limit=10

    • “List Z* modules in QA” → /top_modules?system=QA&pattern=Z%

  4. Render response as a table (FM Name, Calls, Avg Response, Total Time).
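If you want to prototype the intent-to-endpoint routing outside Joule Studio first, a tiny sketch is enough. The helper name is hypothetical, and the base URL reuses the same placeholder as the ABAP push:

```python
from urllib.parse import urlencode

BASE = "http://<python-server>:5000"  # same placeholder as in the ABAP collector

def top_modules_url(system, pattern=None, limit=10):
    """Hypothetical helper: build the /top_modules query for a parsed intent."""
    params = {"system": system, "limit": limit}
    if pattern is not None:
        params["pattern"] = pattern  # note: '%' is percent-encoded as %25
    return f"{BASE}/top_modules?{urlencode(params)}"

# "Top 10 FMs in PRD last week"
print(top_modules_url("PRD"))
# "List Z* modules in QA"
print(top_modules_url("QA", pattern="Z%"))
```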


Step 4 – Optional Analytics

Add anomaly detection in Python:

import sqlite3  # already imported if this lives in the Flask service above
import pandas as pd
from sklearn.ensemble import IsolationForest

def detect_anomalies():
    conn = sqlite3.connect(DB_FILE)
    try:
        df = pd.read_sql("SELECT fm_name, calls, avg_resp_ms FROM fm_usage", conn)
    finally:
        conn.close()
    if df.empty:  # IsolationForest cannot fit on zero samples
        return df
    model = IsolationForest(contamination=0.05, random_state=42)
    df['anomaly'] = model.fit_predict(df[['calls', 'avg_resp_ms']])
    return df[df['anomaly'] == -1]

Expose this via an /anomalies endpoint, and Joule can then answer: “Any anomalies in FM usage today?”
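A minimal sketch of that endpoint, with detect_anomalies() stubbed so the snippet is self-contained (the outlier row is made up; in the real service the IsolationForest version above would be called instead):

```python
from flask import Flask, jsonify
import pandas as pd

app = Flask(__name__)

def detect_anomalies():
    # Stub standing in for the IsolationForest version above;
    # returns a DataFrame of outlier rows, values illustrative.
    return pd.DataFrame(
        [{"fm_name": "Z_HEAVY_JOB", "calls": 90000, "avg_resp_ms": 2400}]
    )

@app.route("/anomalies", methods=["GET"])
def anomalies():
    df = detect_anomalies()
    # orient="records" yields one JSON object per anomalous row
    return jsonify(df.to_dict(orient="records"))
```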


Key Takeaways

  • Continuous FM usage telemetry without manual ST03N exports.

  • Python layer provides analytics, anomaly detection, and REST APIs.

  • SAP Joule makes it accessible in plain language to Basis and Dev teams.


