mrjob --steps error when using make_runner on a Hadoop cluster
I'm trying to run a simple word-count example programmatically on a Hadoop cluster, but I can't get the code to work.
The job, in test_job.py:
from mrjob.job import MRJob
import re

WORD_RE = re.compile(r"[\w']+")

class MRWordFreqCount(MRJob):

    def mapper(self, _, line):
        for word in WORD_RE.findall(line):
            yield word.lower(), 1

    def combiner(self, word, counts):
        yield word, sum(counts)

    def reducer(self, word, counts):
        yield word, sum(counts)
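As a quick local sanity check of the tokenize-and-count logic above, the mapper/combiner/reducer pipeline can be imitated in plain Python without mrjob (a sketch; `count_words` is a hypothetical helper, not part of the mrjob API):

```python
import re
from collections import Counter

# Same tokenizing regex as the job: runs of word characters and apostrophes.
WORD_RE = re.compile(r"[\w']+")

def count_words(lines):
    # Tokenize each line, lowercase, and sum the counts, mirroring what the
    # mapper + reducer do across the cluster.
    counts = Counter()
    for line in lines:
        for word in WORD_RE.findall(line):
            counts[word.lower()] += 1
    return dict(counts)

print(count_words(["Don't panic", "don't PANIC"]))
# -> {"don't": 2, 'panic': 2}
```

This only checks the per-record logic; it says nothing about how mrjob drives the job on Hadoop, which is where the error below comes from.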
The runner, in mr_job_test.py:
from test_jobs import MRWordFreqCount

def test_runner(in_args, input_dir):
    tmp_output = []
    args = in_args + input_dir
    mr_job = MRWordFreqCount(args.split())
    with mr_job.make_runner() as runner:
        runner.run()
        for line in runner.stream_output():
            tmp_output = tmp_output + [line]
    return tmp_output

if __name__ == '__main__':
    input_dir = 'hdfs:///test_input/'
    args = '-r hadoop '
    print test_runner(args, input_dir)
I can run this code locally (with the inline option), but on Hadoop I get:
> Traceback (most recent call last):
>   File "mr_job_tester.py", line 17, in <module>
>     print test_runner(args, input_dir)
>   File "mr_job_tester.py", line 8, in test_runner
>     runner.run()
>   File "/usr/local/lib/python2.7/dist-packages/mrjob/runner.py", line 458, in run
>     self._run()
>   File "/usr/local/lib/python2.7/dist-packages/mrjob/hadoop.py", line 239, in _run
>     self._run_job_in_hadoop()
>   File "/usr/local/lib/python2.7/dist-packages/mrjob/hadoop.py", line 295, in _run_job_in_hadoop
>     for step_num in xrange(self._num_steps()):
>   File "/usr/local/lib/python2.7/dist-packages/mrjob/runner.py", line 742, in _num_steps
>     return len(self._get_steps())
>   File "/usr/local/lib/python2.7/dist-packages/mrjob/runner.py", line 721, in _get_steps
>     raise ValueError("Bad --steps response: \n%s" % stdout)
> ValueError: Bad --steps response:
1 Answer
(According to this) because mrjob submits the job file to the cluster and executes it remotely as the mapper and reducer, the file declaring the job must contain the following lines:
if __name__ == "__main__":
    MRWordFreqCount.run()
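To see why the missing guard produces "Bad --steps response": mrjob determines the job's steps by re-invoking the job file as a subprocess with `--steps` and parsing its stdout. A file without the `__main__` block prints nothing, so the parse fails. A minimal simulation of that mechanism, with no mrjob required (the throwaway scripts here are stand-ins for the real job file, not mrjob code):

```python
import os
import subprocess
import sys
import tempfile

def steps_output(script_body):
    # Write a throwaway "job file" and run it with --steps, the way the
    # runner shells out to the job script.
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(script_body)
        path = f.name
    try:
        return subprocess.check_output([sys.executable, path, "--steps"]).decode()
    finally:
        os.remove(path)

# Without a __main__ guard, importing/running the file produces no stdout,
# which is what makes mrjob raise ValueError("Bad --steps response: ...").
print(repr(steps_output("class Job: pass\n")))  # -> ''

# With a guard that responds to --steps, stdout carries a step description.
print(repr(steps_output(
    'import sys\n'
    'if __name__ == "__main__" and "--steps" in sys.argv:\n'
    '    print("steps: streaming")\n'
)))
```

In the real job file, `MRWordFreqCount.run()` inside the guard is what handles `--steps` (and the mapper/reducer invocations) on the OP's behalf.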