How can I parse a JSON string in PL/SQL?
Depending on the version of the RDBMS you are using, you have at least two choices (apart from writing a parser yourself):
First one: for Oracle 11.1.0.7 and up, install APEX 5 and use the apex_json package:
-- here I have 12.1.0.1 version with version 5 of apex installed
column ora_version format a21;
column apex_version format a21;

select (select version from v$instance) as ora_version
     , (select version_no from apex_release) as apex_version
  from dual;

--drop table test_2;
/* our test table */
create table test_2(
  c_a date,
  c_b date,
  c_c number,
  c_d number,
  c_e number
);

select * from test_2;

declare
  l_json_doc clob;
begin
  dbms_output.put_line('Parsing json...');
  l_json_doc := '{"a":"01/01/2015","b":"31/12/2015",
                  "c":"11111111111","d":"1111111111",
                  "e":"1234567890"}';
  apex_json.parse(l_json_doc);

  insert into test_2(c_a, c_b, c_c, c_d, c_e)
    values(apex_json.get_date(p_path=>'a', p_format=>'dd/mm/yyyy'),
           apex_json.get_date(p_path=>'b', p_format=>'dd/mm/yyyy'),
           to_number(apex_json.get_varchar2(p_path=>'c')),
           to_number(apex_json.get_varchar2(p_path=>'d')),
           to_number(apex_json.get_varchar2(p_path=>'e')));
  commit;
  dbms_output.put_line('Done!');
end;
/

column c_c format 99999999999;

select to_char(c_a, 'dd/mm/yyyy') as c_a
     , to_char(c_b, 'dd/mm/yyyy') as c_b
     , c_c
     , c_d
     , c_e
  from test_2;
Result:
ORA_VERSION           APEX_VERSION
--------------------- ---------------------
12.1.0.1.0            5.0.2.00.07

1 row selected.

Table created.

no rows selected.

Parsing json...
Done!

PL/SQL procedure successfully completed.

C_A        C_B        C_C          C_D        C_E
---------- ---------- ------------ ---------- ----------
01/01/2015 31/12/2015  11111111111 1111111111 1234567890

1 row selected.
Second one: use the open-source PL/JSON library. I had never used it before, so I'm taking this opportunity to try it out. It's quite similar to apex_json.
declare
  l_json     json; -- json object
  l_json_doc clob;
begin
  dbms_output.put_line('Parsing json...');
  -- parsing is done upon object instantiation
  l_json_doc := '{"a":"01/01/2015","b":"31/12/2015",
                  "c":"11111111111","d":"1111111111",
                  "e":"1234567890"}';
  l_json := json(l_json_doc);

  insert into test_2(c_a, c_b, c_c, c_d, c_e)
    values(to_date(l_json.get('a').get_string, 'dd-mm-yyyy'),
           to_date(l_json.get('b').get_string, 'dd-mm-yyyy'),
           to_number(l_json.get('c').get_string),
           to_number(l_json.get('d').get_string),
           to_number(l_json.get('e').get_string));
  commit;
  dbms_output.put_line('Done!');
end;
/

column c_c format 99999999999;

select to_char(c_a, 'dd/mm/yyyy') as c_a
     , to_char(c_b, 'dd/mm/yyyy') as c_b
     , c_c
     , c_d
     , c_e
  from test_2;
Result:
C_A        C_B        C_C          C_D        C_E
---------- ---------- ------------ ---------- ----------
01/01/2015 31/12/2015  11111111111 1111111111 1234567890
01/01/2015 31/12/2015  11111111111 1111111111 1234567890

2 rows selected.
The introduction of json_table() in the 12.1.0.2 release makes JSON parsing a bit simpler (shown here just for the sake of demonstration):
insert into test_2
select to_date(c_a, 'dd-mm-yyyy')
     , to_date(c_b, 'dd-mm-yyyy')
     , c_c
     , c_d
     , c_e
  from json_table('{"a":"01/01/2015",
                    "b":"31/12/2015",
                    "c":"11111111111",
                    "d":"1111111111",
                    "e":"1234567890"}'
                  , '$'
                  columns (
                    c_a varchar2(21) path '$.a',
                    c_b varchar2(21) path '$.b',
                    c_c varchar2(21) path '$.c',
                    c_d varchar2(21) path '$.d',
                    c_e varchar2(21) path '$.e'
                  ));
Result:

select * from test_2;

C_A         C_B         C_C        C_D        C_E
----------- ----------- ---------- ---------- ----------
1/1/2015    12/31/2015  1111111111 1111111111 1234567890
Since you specified you don't want to use any JSON library: if the format is fixed, you could coerce the string into something you can parse as XML. Start by stripping the curly braces, replacing the colons with equals signs, and removing the double quotes from the first part of each name/value pair:
select regexp_replace(regexp_replace(value, '(^{|}$)'),
         '^"(.*)":(".*")($|,)', '\1=\2', 1, 0, 'm')
from tests_1;

REGEXP_REPLACE(REGEXP_REPLACE(VALUE,'(^{|}$)'),'^"(.*)":(".*")($|,)','\1=\2',1,0
--------------------------------------------------------------------------------
a="01/01/2015"
b="31/12/2015"
c="11111111111"
d="1111111111"
e="1234567890"
which you can use as the attributes of a dummy XML node; convert that to XMLType and you can use XMLTable to extract the attributes:
select x.a, x.b, x.c, x.d, x.e
from tests_1 t
cross join xmltable('/tmp'
  passing xmltype('<tmp '
    || regexp_replace(regexp_replace(value, '(^{|}$)'),
         '^"(.*)":(".*")($|,)', '\1=\2', 1, 0, 'm')
    || ' />')
  columns a varchar2(10) path '@a',
          b varchar2(10) path '@b',
          c number       path '@c',
          d number       path '@d',
          e number       path '@e') x;

A          B          C             D             E
---------- ---------- ------------- ------------- -------------
01/01/2015 31/12/2015   11111111111    1111111111    1234567890
Then you can convert the strings to dates during insert:
insert into test_2 (a, b, c, d, e)
select to_date(x.a, 'DD/MM/YYYY'), to_date(x.b, 'DD/MM/YYYY'), x.c, x.d, x.e
from tests_1 t
cross join xmltable('/tmp'
  passing xmltype('<tmp '
    || regexp_replace(regexp_replace(value, '(^{|}$)'),
         '^"(.*)":(".*")($|,)', '\1=\2', 1, 0, 'm')
    || ' />')
  columns a varchar2(10) path '@a',
          b varchar2(10) path '@b',
          c number       path '@c',
          d number       path '@d',
          e number       path '@e') x;

select * from test_2;

A          B          C             D             E
---------- ---------- ------------- ------------- -------------
2015-01-01 2015-12-31   11111111111    1111111111    1234567890
That will cope with some of the name/value pairs not being there, and you'll get nulls if that happens.
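To make the null behaviour concrete, here is a stripped-down sketch (not from the original answer; the literal XML stands in for a document whose "e" pair was missing):

```sql
-- The "e" attribute is absent from the node, so XMLTable simply
-- projects NULL for that column instead of raising an error.
select x.a, x.e
from xmltable('/tmp'
       passing xmltype('<tmp a="01/01/2015" c="11111111111" />')
       columns a varchar2(10) path '@a',
               e number       path '@e') x;
-- x.e comes back null
```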
If all the pairs will always be there, you could just tokenize the string and pull out the relevant parts:
select to_date(regexp_substr(value, '[^"]+', 1, 4), 'DD/MM/YYYY') as a,
       to_date(regexp_substr(value, '[^"]+', 1, 8), 'DD/MM/YYYY') as b,
       to_number(regexp_substr(value, '[^"]+', 1, 12)) as c,
       to_number(regexp_substr(value, '[^"]+', 1, 16)) as d,
       to_number(regexp_substr(value, '[^"]+', 1, 20)) as e
from tests_1;

A          B          C             D             E
---------- ---------- ------------- ------------- -------------
2015-01-01 2015-12-31   11111111111    1111111111    1234567890
Oracle 12c supports JSON
If you have an existing table, you can simply do:
ALTER TABLE table1 ADD CONSTRAINT constraint_name CHECK (your_column IS JSON);

SELECT t.your_column.id FROM table1 t;
Note that the table alias (t here) is necessary; the JSON dot notation does not resolve without it.
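A minimal sketch of that quirk, using the same hypothetical table1/your_column names as above:

```sql
-- Without an alias, Oracle cannot resolve the JSON dot notation
-- and the query fails:
--   SELECT your_column.id FROM table1;
-- With an alias, the same path works:
SELECT t.your_column.id FROM table1 t;
```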
Or complete example:
CREATE TABLE json_documents (
  id    RAW(16) NOT NULL,
  data  CLOB,
  CONSTRAINT json_documents_pk PRIMARY KEY (id),
  CONSTRAINT json_documents_json_chk CHECK (data IS JSON)
);

INSERT INTO json_documents (id, data)
VALUES (SYS_GUID(),
        '{
          "FirstName" : "John",
          "LastName" : "Doe",
          "Job" : "Clerk",
          "Address" : {
            "Street" : "99 My Street",
            "City" : "My City",
            "Country" : "UK",
            "Postcode" : "A12 34B"
          },
          "ContactDetails" : {
            "Email" : "john.doe@example.com",
            "Phone" : "44 123 123456",
            "Twitter" : "@johndoe"
          },
          "DateOfBirth" : "01-JAN-1980",
          "Active" : true
        }');

SELECT a.data.FirstName,
       a.data.LastName,
       a.data.Address.Postcode AS Postcode,
       a.data.ContactDetails.Email AS Email
FROM   json_documents a;

FIRSTNAME       LASTNAME        POSTCODE   EMAIL
--------------- --------------- ---------- -------------------------
Jayne           Doe             A12 34B    jayne.doe@example.com
John            Doe             A12 34B    john.doe@example.com

2 rows selected.
More info