Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Fixed
Description
In the attached script, the resulting PersistentWrites for doutc1_agg and dWc1_agg end up with unknown block sizes, even though the input DAGs for those variables have known block sizes. As a result, when we use MLContext and mark those variables as outputs, the PersistentWrites are rewritten to TransientWrites, and the block sizes remain unknown.
To run:
spark-submit $SYSTEMML_HOME/target/SystemML.jar -f scenario1.dml -explain recompile_hops
Attachments
Issue Links
- relates to SYSTEMDS-540 Deep Learning (In Progress)