Details

    • Type: Test
    • Status: Resolved
    • Priority: Trivial
    • Resolution: Incomplete
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: None
    • Labels: None

      Activity

      YoungYik created issue -
      YoungYik added a comment -

      I have made a mistake in my query SQL, just forget this issue, sorry.

      YoungYik made changes -
      Field: Original Value → New Value
      Summary: set hive.exec.reducers.max=<number> is not working to view test
      Issue Type: Bug [ 1 ] → Test [ 6 ]
      Affects Version/s: 0.6.0 [ 12314524 ]
      Environment: Linux 2.6.18-128.el5 x86_64 GNU/Linux, JRE 1.6.0_14
      Priority: Major [ 3 ] → Trivial [ 5 ]
      Description: I create two views over different partitions of one table and join the two views in my query. The job uses only 1 reducer, the tasks stay at 82% for a long time, and then the job fails.

      So I set hive.exec.reducers.max=28 before executing the query. When selecting on joined tables it works, but Hive still reports "Number of reduce tasks determined at compile time: 1" when selecting on the two joined views:


      1) hive -e "create view view_1(uname, login) as select uname,'this30d' from userlist where domain='$domain' and year=$YYYY and month=$MM and day=$DD and type='all' and period_days=30;"

      2) hive -e "create view view_0(uname, login) as select uname,'last30d' from userlist where domain='$domain' and year=$YYYY and month=$MM and day=$DD and type='all' and period_days=30;"

      3) hive -e "set mapred.reduce.tasks=28; set; select v0.login,v1.login,count(*) from view_0 v0 full outer join view_1 v1 group by v0.login,v1.login; set;"

      then the output:

      Total MapReduce jobs = 2
      Launching Job 1 out of 2
      Number of reduce tasks determined at compile time: 1
      In order to change the average load for a reducer (in bytes):
        set hive.exec.reducers.bytes.per.reducer=<number>
      In order to limit the maximum number of reducers:
        set hive.exec.reducers.max=<number>
      In order to set a constant number of reducers:
        set mapred.reduce.tasks=<number>

      In the end the query still uses only 1 reducer, yet printing the environment with "set;" shows:

      mapred.reduce.tasks=28
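For context on the hints in the job output above: Hive estimates the reducer count at compile time as roughly the total input size divided by hive.exec.reducers.bytes.per.reducer, capped by hive.exec.reducers.max, and an explicit mapred.reduce.tasks overrides that estimate. However, a join with no ON clause (as in the query above) is typically compiled as a cartesian product over a single reduce key, which pins the count to 1 regardless of these settings. A minimal sketch of the estimate, under the simplifying assumptions just stated (not Hive's actual code):

```python
import math

def estimate_reducers(total_input_bytes,
                      bytes_per_reducer=1_000_000_000,  # hive.exec.reducers.bytes.per.reducer
                      max_reducers=999,                 # hive.exec.reducers.max
                      forced_tasks=-1):                 # mapred.reduce.tasks; -1 = let Hive estimate
    """Simplified sketch of Hive's compile-time reducer estimate (an assumption,
    not the real implementation)."""
    if forced_tasks > 0:
        # An explicit mapred.reduce.tasks wins over the estimate.
        return forced_tasks
    # Otherwise: one reducer per bytes_per_reducer of input, capped at max_reducers.
    estimated = math.ceil(total_input_bytes / bytes_per_reducer)
    return max(1, min(max_reducers, estimated))
```

So with 5 GB of input and default settings the sketch yields 5 reducers, while setting forced_tasks=28 (as the reporter did) returns 28 for any job whose plan does not force a single reducer.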
      Component/s: CLI [ 12313604 ]
      Carl Steinbach made changes -
      Status: Open [ 1 ] → Resolved [ 5 ]
      Resolution: Incomplete [ 4 ]

        People

        • Assignee: Unassigned
        • Reporter: YoungYik
        • Votes: 0
        • Watchers: 0
