The ANSI SQL:2011 standard states that an exception should be thrown when an arithmetic operation overflows. This is what most SQL databases do (e.g. SQL Server, DB2). Hive currently returns NULL (as Spark does), but HIVE-18291 is open to make it SQL compliant.
I propose adding a config option that lets users choose whether Spark should behave according to the SQL standard or keep its current behavior (i.e. returning NULL on overflow).
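To illustrate the two behaviors the config option would switch between, here is a minimal sketch in plain Java (not Spark internals; the method names and the use of `OptionalLong` to model SQL NULL are illustrative assumptions):

```java
import java.util.OptionalLong;

public class OverflowModes {
    // SQL-standard mode: overflow raises an exception.
    // Math.addExact throws ArithmeticException when the result overflows a long.
    static long addAnsi(long a, long b) {
        return Math.addExact(a, b);
    }

    // Current Spark/Hive mode: overflow yields NULL,
    // modeled here as OptionalLong.empty().
    static OptionalLong addNullOnOverflow(long a, long b) {
        try {
            return OptionalLong.of(Math.addExact(a, b));
        } catch (ArithmeticException e) {
            return OptionalLong.empty();
        }
    }

    public static void main(String[] args) {
        // Non-overflowing addition behaves the same in both modes.
        System.out.println(addAnsi(1L, 2L));                                   // 3
        // Overflowing addition: NULL in the current mode...
        System.out.println(addNullOnOverflow(Long.MAX_VALUE, 1L).isPresent()); // false
        // ...and an ArithmeticException in SQL-standard mode.
        try {
            addAnsi(Long.MAX_VALUE, 1L);
        } catch (ArithmeticException e) {
            System.out.println("overflow exception");
        }
    }
}
```

The proposed config would simply select between these two code paths at execution time, so existing workloads relying on NULL results remain unaffected unless the user opts in to the standard-compliant behavior.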