SCS and MOSEK solvers keep running and never finish

Published 2024-05-14 16:58:51


My application had been using the ECOS solver for quite a long time, when we suddenly started getting infeasible solutions, which led to solver errors. After reading through some stack traces and suggestions online, I saw recommendations to use the MOSEK and SCS solvers.

I tried replacing ECOS with the SCS and MOSEK solvers, but my runs never finish. Normally a run completes within 2 hours, but after the change it runs for around 8 hours and never ends. Please advise.
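One way to keep a run from hanging for 8+ hours is to put a hard wall-clock budget around each solve call. The sketch below is hypothetical plain Python (the names `solve_with_budget` and `_quick_solve` are mine, not from the application) and stands in for whatever function actually invokes the solver:

```python
# Hypothetical sketch: give each solve a hard wall-clock budget so a stuck
# solver run is detected instead of hanging indefinitely. Caveat: a thread
# cannot be forcibly killed, so on timeout the solve keeps running in the
# background; in a real Spark job a separate process (or the solver's own
# time-limit option) is preferable.
import time
from concurrent.futures import ThreadPoolExecutor
from concurrent.futures import TimeoutError as FutureTimeout

def solve_with_budget(solve_fn, args=(), budget_s=2.0):
    """Run solve_fn(*args); raise RuntimeError if it exceeds budget_s seconds."""
    pool = ThreadPoolExecutor(max_workers=1)
    future = pool.submit(solve_fn, *args)
    try:
        return future.result(timeout=budget_s)
    except FutureTimeout:
        raise RuntimeError(f"solve exceeded {budget_s}s wall-clock budget")
    finally:
        # Do not block on the (possibly stuck) worker thread.
        pool.shutdown(wait=False)

# Toy usage: a fast "solve" finishes well within the budget...
def _quick_solve(x):
    return x * 2

value = solve_with_budget(_quick_solve, (21,), budget_s=30.0)

# ...while a slow one trips the budget and raises.
try:
    solve_with_budget(time.sleep, (1.5,), budget_s=0.2)
    timed_out = False
except RuntimeError:
    timed_out = True
```

Whether a budget like this is appropriate depends on whether a partial/failed solve is acceptable for the job; the alternative is to pass the solver's own iteration or time limits so it terminates on its own.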

Below are the parameters:

'solver': {'name': 'MOSEK', 'backup_name': 'SCS', 'verbose': True, 'max_iters': 3505}
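For reference, the primary/backup behaviour visible in the traceback below (model.py retrying with `solver_params['backup_name']`) can be sketched in plain Python. Everything here is a hypothetical stand-in for the real application code and CVXPY machinery; only the parameter names (`name`, `backup_name`, `max_iters`) come from the config above:

```python
# Hypothetical sketch of a primary/backup solver strategy mirroring the
# 'name' / 'backup_name' parameters above. The solver callables and this
# SolverError class stand in for the real CVXPY machinery.

class SolverError(Exception):
    """Raised when a solver fails to return a usable solution."""

def solve_with_fallback(problem, solvers, params):
    """Try the primary solver; on SolverError, retry once with the backup.

    `solvers` maps solver names to callables taking (problem, **options).
    """
    primary = params["name"]
    backup = params.get("backup_name")
    try:
        return solvers[primary](problem, max_iters=params.get("max_iters"))
    except SolverError:
        if backup is None:
            raise
        # Fall back exactly once, as the traceback suggests model.py does.
        return solvers[backup](problem, max_iters=params.get("max_iters"))

# Toy usage: the primary "MOSEK" stub fails, the "SCS" stub succeeds.
def mosek_stub(problem, **opts):
    raise SolverError("infeasible")

def scs_stub(problem, **opts):
    return {"status": "optimal", "by": "SCS"}

result = solve_with_fallback(
    "toy-problem",
    {"MOSEK": mosek_stub, "SCS": scs_stub},
    {"name": "MOSEK", "backup_name": "SCS", "verbose": True, "max_iters": 3505},
)
```

Note that with this structure, if both the primary and the backup solver hang rather than raise, the whole job hangs, which matches the behaviour described above.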

Kindly help.

Error log:

Job aborted due to stage failure: Task 1934 in stage 6.0 failed 4 times, most recent failure: Lost task 1934.3 in stage 6.0 (TID 5028, ip-10-219-208-218.ec2.internal, executor 1): org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/py_dependencies.zip/cat/tf/tf_model/model.py", line 262
    raise SolverError
cvxpy.error.SolverError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/cvxpy/expressions/constants/constant.py", line 243, in extremal_eig_near_ref
    ev = SA_eigsh(sigma)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/cvxpy/expressions/constants/constant.py", line 238, in SA_eigsh
    return eigsh(A, k=1, sigma=sigma, return_eigenvectors=False)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 1687, in eigsh
    params.iterate()
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 571, in iterate
    self._raise_no_convergence()
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 377, in _raise_no_convergence
    raise ArpackNoConvergence(msg % (num_iter, k_ok, self.k), ev, vec)
scipy.sparse.linalg.eigen.arpack.arpack.ArpackNoConvergence: ARPACK error -1: No convergence (361 iterations, 0/1 eigenvectors converged)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/pyspark.zip/pyspark/worker.py", line 377, in main
    process()
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/pyspark.zip/pyspark/worker.py", line 372, in process
    serializer.dump_stream(func(split_index, iterator), outfile)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/pyspark.zip/pyspark/serializers.py", line 400, in dump_stream
    vs = list(itertools.islice(iterator, batch))
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/pyspark.zip/pyspark/util.py", line 113, in wrapper
    return f(*args, **kwargs)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000001/py_dependencies.zip/pyspark_scripts/spark_tf_pipeline.py", line 49
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/py_dependencies.zip/cat/tf/tf_model/tf_get_from_smu_records.py", line 38, in tf_get_from_smu_records
    data_points, current_date_string, params)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/py_dependencies.zip/cat/tf/tf_model/tf_get_from_smu_records.py", line 24, in tf_get_outputs_from_smu_records
    model_output = _fit_model(ts_wrapper, params)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/py_dependencies.zip/cat/tf/tf_model/fit_model.py", line 13, in fit_model
    machine_model.fit()
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/py_dependencies.zip/cat/tf/tf_model/machine_model.py", line 62
    self._fit()
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/py_dependencies.zip/cat/tf/tf_model/machine_model.py", line 120
    self.model.fit()
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/py_dependencies.zip/cat/tf/tf_model/model.py", line 267
    self._fit(self.solver_params['backup_name'])
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/py_dependencies.zip/cat/tf/tf_model/model.py", line 245
    feastol_inacc=tols['feastol_inacc'])
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/cvxpy/problems/problem.py", line 401, in solve
    return solve_func(self, *args, **kwargs)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/cvxpy/problems/problem.py", line 818, in _solve
    self, data, warm_start, verbose, kwargs)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/cvxpy/reductions/solvers/solving_chain.py", line 341, in solve_via_data
    solver_opts, problem._solver_cache)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/cvxpy/reductions/solvers/conic_solvers/cvxopt_conif.py", line 162, in solve_via_data
    if self.remove_redundant_rows(data) == s.INFEASIBLE:
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/cvxpy/reductions/solvers/conic_solvers/cvxopt_conif.py", line 286, in remove_redundant_rows
    eig = extremal_eig_near_ref(gram, ref=TOL)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/cvxpy/expressions/constants/constant.py", line 247, in extremal_eig_near_ref
    ev = SA_eigsh(sigma)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/cvxpy/expressions/constants/constant.py", line 238, in SA_eigsh
    return eigsh(A, k=1, sigma=sigma, return_eigenvectors=False)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 1687, in eigsh
    params.iterate()
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 571, in iterate
    self._raise_no_convergence()
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 377, in _raise_no_convergence
    raise ArpackNoConvergence(msg % (num_iter, k_ok, self.k), ev, vec)
scipy.sparse.linalg.eigen.arpack.arpack.ArpackNoConvergence: ARPACK error -1: No convergence (361 iterations, 0/1 eigenvectors converged)

at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.handlePythonException(PythonRunner.scala:456)
at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRunner.scala:592)
at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRunner.scala:575)
at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:410)
at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at org.apache.spark.sql.execution.UnsafeExternalRowSorter.sort(UnsafeExternalRowSorter.java:227)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$3.apply(ShuffleExchangeExec.scala:283)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$3.apply(ShuffleExchangeExec.scala:252)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$24.apply(RDD.scala:858)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$24.apply(RDD.scala:858)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1405)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

Driver stacktrace:


