Description
Hello!
It seems that when a CTE is defined and then referenced inside another CTE in Spark SQL, Spark treats the inner reference as a table or view, fails to resolve it, and errors with `Table or view not found: <CTE name>`.
Steps to reproduce
- `pip install pyspark==3.2.0` (also happens with 3.2.1)
- start the PySpark console by running `pyspark` in a terminal
- try to run the following SQL with `spark.sql(sql)`:
```sql
WITH mock_cte__users AS (
    SELECT 1 AS id
),

model_under_test AS (

    WITH users AS (
        SELECT * FROM mock_cte__users
    )

    SELECT * FROM users
)

SELECT * FROM model_under_test;
```
Spark fails with:

```
pyspark.sql.utils.AnalysisException: Table or view not found: mock_cte__users; line 8 pos 29;
```
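For convenience, here is the same reproduction as a self-contained script (a minimal sketch; it assumes a local PySpark 3.2.0/3.2.1 install):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[1]").getOrCreate()

sql = """
WITH mock_cte__users AS (
    SELECT 1 AS id
),

model_under_test AS (

    WITH users AS (
        SELECT * FROM mock_cte__users
    )

    SELECT * FROM users
)

SELECT * FROM model_under_test
"""

# On 3.2.0 and 3.2.1 this raises pyspark.sql.utils.AnalysisException:
# Table or view not found: mock_cte__users
spark.sql(sql).show()
```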
I don't know whether this is a regression or expected behavior in the 3.2.* line. The fix introduced in 3.2.0 for https://issues.apache.org/jira/browse/SPARK-36447 might be related.
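In case it helps others hitting this: flattening the query so every CTE is defined at the top level (and references only CTEs defined before it) seems to avoid the error. This is a workaround sketch on my part, not a confirmed fix:

```sql
WITH mock_cte__users AS (
    SELECT 1 AS id
),

users AS (
    SELECT * FROM mock_cte__users
),

model_under_test AS (
    SELECT * FROM users
)

SELECT * FROM model_under_test;
```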