Affects Version/s: 4.8.0
Fix Version/s: 4.13.0
I am testing some code that uses the Phoenix Spark plugin to read a Phoenix table whose name carries a namespace prefix (the table was created as a Phoenix table, not an HBase table), but the read fails with a TableNotFoundException.
The table is clearly there, because I can query it with plain Phoenix SQL through SQuirreL. In addition, querying it through Spark SQL works without any problem.
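For reference, the failing read has roughly the following shape. This is a sketch based on the phoenix-spark DataFrame API; the ZooKeeper quorum and the exact options used in the attached testSpark are assumptions:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Local Spark context for illustration only; the real code runs on the cluster
val sc = new SparkContext(new SparkConf().setAppName("phoenix-test").setMaster("local[*]"))
val sqlContext = new SQLContext(sc)

val df = sqlContext.read
  .format("org.apache.phoenix.spark")
  .option("table", "ACME:ENDPOINT_STATUS") // namespace-prefixed name, as created
  .option("zkUrl", "zkhost:2181")          // placeholder ZooKeeper quorum
  .load()                                  // TableNotFoundException is thrown here
df.show()
```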
I am running on the HDP 2.5 platform, with Phoenix 4.7.0.2.5.0.0-1245.
The problem does not occur at all when I run the same code on an HDP 2.4 cluster with Phoenix 4.4.
Nor does it occur on HDP 2.5 when I query a table without a namespace prefix in its name.
The log is in the attached file: tableNoFound.txt
My testing code is also attached.
The weird thing is that, in the attached code, running testSpark alone gives the above exception, but running testJdbc first, followed by testSpark, makes both work.
After changing the DDL to
create table ACME.ENDPOINT_STATUS ...
the phoenix-spark plugin seems to work. I also noticed some weird behavior.
If I run both of the following:
create table ACME.ENDPOINT_STATUS ...
create table "ACME:ENDPOINT_STATUS" ...
both tables show up in Phoenix: the first with schema ACME and table name ENDPOINT_STATUS, the latter with no schema and table name ACME:ENDPOINT_STATUS.
However, in HBase I see only one table, ACME:ENDPOINT_STATUS. In addition, upserts into ACME.ENDPOINT_STATUS show up in the other table, and vice versa.
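The cross-table visibility can be shown through plain JDBC. A sketch, assuming a hypothetical ID column and a placeholder ZooKeeper quorum (only the table names come from the report):

```scala
import java.sql.DriverManager

val conn = DriverManager.getConnection("jdbc:phoenix:zkhost:2181")
val stmt = conn.createStatement()

// Upsert through the dot-separated (schema) name...
stmt.executeUpdate("UPSERT INTO ACME.ENDPOINT_STATUS (ID) VALUES ('e1')")
conn.commit()

// ...and the same row is visible through the quoted colon name as well
val rs = stmt.executeQuery("SELECT * FROM \"ACME:ENDPOINT_STATUS\"")
while (rs.next()) println(rs.getString("ID"))

conn.close()
```

Both names resolve to the single underlying HBase table ACME:ENDPOINT_STATUS, which is why writes through either Phoenix table appear in the other.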