Friday, 20 May 2016

RESTful Service JSON limitation?

Experiencing issues running large-scale reports

I did not expect...


Situation: you need to create a RESTful service to retrieve data for reporting purposes. Will this report contain a large set of data? Possibly. Then you should continue reading.


I had a similar requirement on a project and thought there would be no problem using the JSON format returned by ORDS in my APEX 5 instance.

To be precise, I needed to use SQL cursor expressions in my query, and I am assuming that might be the root of the problem.

What happened is this: the client was expecting the JSON to contain 1060 entities, but for some reason ORDS was returning only 560 or so. Worse, whenever I tested it I kept getting valid JSON back, so I never suspected I was losing data along the way.
After spending a couple of days looking at this, it is still not clear what is causing the data to be cut off.
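One lesson from this: a syntactically valid response tells you nothing about completeness, so it is worth counting the entities the payload actually contains against a plain count(*) run directly on the tables. A minimal sketch in Python (the payload shape follows the demo below; the helper name and trimmed sample are my own, not part of the service):

```python
import json

def count_programs(doc):
    """Recursively count entries in every "program" array of a parsed JSON document."""
    total = 0
    if isinstance(doc, dict):
        for key, value in doc.items():
            if key == "program" and isinstance(value, list):
                total += len(value)
            total += count_programs(value)
    elif isinstance(doc, list):
        for item in doc:
            total += count_programs(item)
    return total

# A trimmed stand-in for the real response, with two programs.
payload = json.loads("""
{"elements": [{"filename": "OTCfile",
               "data": [{"program": [{"prog_id": 1, "person": []},
                                     {"prog_id": 2, "person": []}]}]}]}
""")

expected = 2  # in practice, compare against count(*) on the source tables
found = count_programs(payload)
print(found, "of", expected, "programs present")  # prints: 2 of 2 programs present
```

Had a check like this been part of testing, the 560-of-1060 shortfall would have shown up on the first run instead of days later.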

Could it be that Oracle ORDS has JSON limitations when working with cursor expressions, or does this apply to other methods too? If that is the case, it would be great to know what these limits are.

Due to confidentiality policies I cannot show my final JSON file with all the data, but to give you a feel for what was going on I created a demo:

{
    "elements": [{
        "filename": "OTCfile",
        "data": [{
            "program": [{
                "prog_id": 234343434,
                "prog_name": "A & M TOOL, INC.",
                "addr": "PO BOX 309, NC 28754",
                "c_name": "JACK C. JACKSON",
                "c_id": 55,
                "total_rows": 1,
                "person": [{
                    "internal_id": 91659,
                    "internal_name": "JACK M. JACKSON",
                    "internal_deptid": "51411100",
                    "internal_status": "Other Veteran",
                    "apply_flag": "Y",
                    "create_date": "09/03/2013",
                    "start_date": "09/03/2013",
                    "people_count": 0,
                    "other_count": 1,
                    "vat": 1,
                    "apply_num": 1
                }]
            }]
        }]
    }]
}
Query used:
select
  'file1' as "filename",
  cursor(
    select
      cursor(
        select
          p.id as "prog_id",
          p.name as "prog_name",
          p.address || p.zip_code as "addr",
          c.fname || ' ' || c.lname as "c_name",
          c.id as "c_id",
          c.total_rows as "total_rows",
          cursor(
            select
              pp.internal_id as "internal_id",
              pd.lname || pd.fname as "internal_name",
              pp.ptrade_trad_id as "internal_deptid",
              pp.internal_status as "internal_status",
              pp.apply as "apply_flag",
              to_char(pp.create_date, 'mm/dd/yyyy') as "create_date",
              to_char(pp.begin_date, 'mm/dd/yyyy') as "start_date",
              pp.other_count as "other_count",
              pp.people_count as "people_count",
              pp.vat as "vat",
              pp.apply_num as "apply_num"
            from aprogramperson pp
            join apersondetails pd
              on pd.internal_id = pp.internal_id
            join terminations tr
              on pp.terminate_rsn = tr.code
            where pp.prog_id = p.id
              and pp.type_of_appr in (1, 2, 3)
              and (pp.apply = 'y' or pp.begin_date is not null)
            order by pp.prog_id, pd.lname, pd.fname, pd.mname
          ) as "person"
        from programs p
        inner join consultants c
          on p.cons_id = c.id
        order by p.name
      ) as "program"
    from dual
  ) as "data"
from dual
You will notice this is nothing complex; the key elements here are the nested "program" and "person" cursor expressions.

Now imagine you have 1000+ programs and n related persons attached to each program. You would start building on this example, and soon things might not work as you expect.

My total was 1067 programs, with the rare program having 20-30 persons attached; most programs had fewer than 10 people inside.

In my case things went bad once the output grew past a certain size: the returned file presented only half of the set I needed to produce.
The real question is why, and how to work around it.

Unfortunately, to this day I still do not have an answer. But I decided to write this post to make people aware of this issue when working with ORDS, JSON and cursor expressions.

I would love to hear if anyone has had similar issues and what the cause was. Please get in touch; this is something that intrigues me. Don't you hate it when things are left half done?

I had heard before that web services should not be used for large data sets, but I never heard what the limit was or why. Most likely there are performance issues to consider here, but for the sake of testing and pushing the limits I wanted to get to the bottom of this.

Workaround? The approach that always works is a standard PL/SQL package/procedure that generates the JSON manually. Is that ideal? Probably not, but at least I got to the point where all the data was being returned.
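A different mitigation, purely on the consumer side (and not the PL/SQL workaround just described), is to page through the resource in smaller chunks so no single response grows past the size where data starts disappearing, then stitch the pieces together. A hedged sketch; fetch_page and the offset/limit parameters are illustrative assumptions, not ORDS specifics:

```python
def fetch_all(fetch_page, page_size=100):
    """Collect every row from a paged resource.

    fetch_page(offset, limit) is assumed to return a list of rows,
    and an under-full page signals the end of the data.
    """
    rows, offset = [], 0
    while True:
        page = fetch_page(offset, page_size)
        rows.extend(page)
        if len(page) < page_size:
            break
        offset += page_size
    return rows

# Fake endpoint standing in for the real REST call: 1067 rows, like the post.
data = [{"prog_id": i} for i in range(1067)]

def fake_fetch(offset, limit):
    return data[offset:offset + limit]

result = fetch_all(fake_fetch, page_size=100)
print(len(result))  # prints: 1067 -- all rows arrive, 100 at a time
```

The trade-off is more round trips and client-side assembly, but each response stays well under whatever ceiling is cutting the single large document short.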


Happy APEXing,
Lino


Update on 23/06/2016

Found an excellent example from Lucas here that explains how I could replace my SQL cursor expressions with a WITH clause and LISTAGG, which might help.




