Apache HAWQ / HAWQ-1548

Ambiguous message while logging hawq utilization


Details

    • Type: Improvement
    • Status: Closed
    • Priority: Major
    • Resolution: Not A Problem
    • Affects Version/s: None
    • Fix Version/s: backlog
    • Component/s: libyarn
    • Labels: None

    Description

      When YARN mode is enabled, the resource broker logs two things:

      • the YARN cluster's total resource
      • HAWQ's total resource per node.

      The following messages are logged:

      2017-11-11 23:21:40.944904 UTC,,,p549330,th900077856,,,,0,con4,,seg-10000,,,,,"LOG","00000","Resource manager YARN resource broker counted YARN cluster having total resource (1376256 MB, 168.000000 CORE).",,,,,,,0,,"resourcebroker_LIBYARN.c",776,
      
      2017-11-11 23:21:40.944921 UTC,,,p549330,th900077856,,,,0,con4,,seg-10000,,,,,"LOG","00000","Resource manager YARN resource broker counted HAWQ cluster now having (98304 MB, 12.000000 CORE) in a YARN cluster of total resource (1376256 MB, 168.000000 CORE).",,,,,,,0,,"resourcebroker_LIBYARN.c",785,
      

      The second message shown above is ambiguous. Reading the sentence below, it appears that the complete HAWQ cluster as a whole has only 98304 MB and 12 cores. However, according to the configuration, it should be 98304 MB and 12 cores per segment server.

      Resource manager YARN resource broker counted HAWQ cluster now having (98304 MB, 12.000000 CORE) in a YARN cluster of total resource (1376256 MB, 168.000000 CORE).
      

      Either the wrong variables are being printed, or the message should be corrected to state that the resources logged are per node, since the current wording can mislead the user into thinking the HAWQ cluster does not have enough resources.
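      A quick sanity check on the figures in the logs supports the per-node reading (a minimal sketch; the segment count of 14 is inferred from the numbers, not stated anywhere in the logs):

      ```python
      # Figures copied from the log messages above.
      yarn_total_mb, yarn_total_cores = 1376256, 168.0
      hawq_per_node_mb, hawq_per_node_cores = 98304, 12.0

      # If (98304 MB, 12 CORE) is a per-node figure, both memory and
      # cores should divide the YARN totals by the same segment count.
      segments_by_mem = yarn_total_mb / hawq_per_node_mb
      segments_by_cores = yarn_total_cores / hawq_per_node_cores

      print(segments_by_mem, segments_by_cores)  # 14.0 14.0
      ```

      Both ratios agree, which is consistent with a 14-segment cluster and hard to explain if the message really reported the whole cluster's total.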


            People

              Assignee: Wen Lin (wlin)
              Reporter: Shubham Sharma (outofmemory)

              Dates

                Created:
                Updated:
                Resolved: