[GHF] Add test for `GitHubPR.get_last_comment` by malfet · Pull Request #74570 · pytorch/pytorch · GitHub

[GHF] Add test for `GitHubPR.get_last_comment` #74570


Closed
malfet wants to merge 1 commit

Conversation

@malfet (Contributor) commented on Mar 22, 2022

Also, fix regression/typo introduced by #74549

[ghstack-poisoned]
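The commit adds a unit test for `GitHubPR.get_last_comment`. A minimal sketch of how such a test can be structured against a stubbed comment list (all names besides `get_last_comment` are illustrative assumptions, not the actual `trymerge` test code):

```python
# Hypothetical sketch: test a "last comment" accessor against a fake PR
# backed by a fixed, in-memory comment list instead of GitHub's API.

class FakeComment:
    """Stand-in for a PR comment: just an author and a body."""
    def __init__(self, author: str, body: str) -> None:
        self.author = author
        self.body = body

class FakePR:
    """Stand-in for GitHubPR, holding comments in posting order."""
    def __init__(self, comments: list) -> None:
        self._comments = comments

    def get_comments(self) -> list:
        return list(self._comments)

    def get_last_comment(self) -> FakeComment:
        # The behavior under test: return the most recently posted comment.
        return self.get_comments()[-1]

pr = FakePR([
    FakeComment("alice", "first"),
    FakeComment("bob", "@pytorchbot merge this"),
])
last = pr.get_last_comment()
assert last.author == "bob"
assert "merge" in last.body
```

In the real test, the fake data would come from a recorded GraphQL response rather than hand-built objects, but the assertion shape is the same.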
@facebook-github-bot (Contributor) commented on Mar 22, 2022

🔗 Helpful links

💊 CI failures summary and remediations

As of commit ae67e7c (more details on the Dr. CI page):


  • 9/10 failures introduced in this PR
  • 1/10 broken upstream at merge base ea56b9d on Mar 22 from 10:26am to 1:26pm

🕵️ 9 new failures recognized by patterns

The following CI failures do not appear to be due to upstream breakages:

See GitHub Actions build pull / win-vs2019-cpu-py3 / test (default, 1, 2, windows.4xlarge) (1/9)

Step: "Test" (full log | diagnosis details | 🔁 rerun)

2022-03-22T20:49:18.9893346Z FAIL [0.016s]: tes...k_compat (__main__.TestFXAPIBackwardCompatibility)
2022-03-22T20:49:18.9375461Z   test_type_check_reshape_false (fx.test_gradual_type.TypeCheckerTest) ... ok (0.000s)
2022-03-22T20:49:18.9410033Z   test_type_check_reshape_true (fx.test_gradual_type.TypeCheckerTest) ... ok (0.000s)
2022-03-22T20:49:18.9432890Z   test_type_check_symbolic_inferenceconv2D_maxpool2d_flatten (fx.test_gradual_type.TypeCheckerTest) ... skip (0.000s)
2022-03-22T20:49:18.9462041Z   test_type_check_transpose_False (fx.test_gradual_type.TypeCheckerTest) ... ok (0.016s)
2022-03-22T20:49:18.9495326Z   test_type_check_transpose_true (fx.test_gradual_type.TypeCheckerTest) ... ok (0.000s)
2022-03-22T20:49:18.9769355Z   test_type_maxpool2d_fully_static (fx.test_gradual_type.TypeCheckerTest) ... ok (0.031s)
2022-03-22T20:49:18.9809288Z   test_type_typechecl_maxpool2d_3dinput (fx.test_gradual_type.TypeCheckerTest) ... ok (0.000s)
2022-03-22T20:49:18.9889300Z   test_typecheck_basicblock (fx.test_gradual_type.TypeCheckerTest) ... ok (0.000s)
2022-03-22T20:49:18.9891889Z 
2022-03-22T20:49:18.9892352Z ======================================================================
2022-03-22T20:49:18.9893346Z FAIL [0.016s]: test_function_back_compat (__main__.TestFXAPIBackwardCompatibility)
2022-03-22T20:49:18.9894509Z Test backward compatibility for function signatures with
2022-03-22T20:49:18.9895333Z ----------------------------------------------------------------------
2022-03-22T20:49:18.9896168Z Traceback (most recent call last):
2022-03-22T20:49:18.9899277Z   File "test_fx.py", line 3674, in test_function_back_compat
2022-03-22T20:49:18.9900363Z     self.assertExpected('\n'.join(signature_strs), 'fx_backcompat_function_signatures')
2022-03-22T20:49:18.9901892Z AssertionError: "torc[5534 chars] None) -> Any\ntorch.fx.interpreter.Interprete[3375 chars]tch]" != "torc[5534 chars] None, enable_io_processing: bool = True) -> A[3410 chars]tch]"
2022-03-22T20:49:18.9903728Z   torch.fx._symbolic_trace.Tracer.__init__(self, autowrap_modules: Tuple[Callable] = (<module math>,), autowrap_functions: Tuple[Callable, ...] = (,), param_shapes_constant: bool = False) -> None
2022-03-22T20:49:18.9905537Z   torch.fx._symbolic_trace.Tracer.call_module(self, m: torch.nn.modules.module.Module, forward: Callable[..., Any], args: Tuple[Any, ...], kwargs: Dict[str, Any]) -> Any
2022-03-22T20:49:18.9906956Z   torch.fx._symbolic_trace.Tracer.create_arg(self, a: Any) -> 'Argument'
2022-03-22T20:49:18.9908305Z   torch.fx._symbolic_trace.Tracer.is_leaf_module(self, m: torch.nn.modules.module.Module, module_qualified_name: str) -> bool

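The assertion above comes from a snapshot-style check of public function signatures: each signature is rendered to a string and compared against a committed expected file. A minimal sketch of the mechanism (illustrative only, not the actual test_fx.py harness):

```python
# Sketch of a signature backward-compatibility check: render each public
# function's signature as a string and compare against a stored snapshot.
import inspect

def signature_string(fn) -> str:
    """Render a function's qualified name plus its signature string."""
    return f"{fn.__qualname__}{inspect.signature(fn)}"

def example(x: int, flag: bool = False) -> int:
    return x + int(flag)

# A previously recorded snapshot of the signature. Adding or removing a
# parameter (like enable_io_processing in the log above) changes the
# rendered string, so the comparison fails and flags the BC break.
snapshot = "example(x: int, flag: bool = False) -> int"
assert signature_string(example) == snapshot
```

This is why the diff in the AssertionError shows two long signature dumps differing only by the `enable_io_processing: bool = True` parameter.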
See GitHub Actions build pull / linux-xenial-cuda11.3-py3.7-gcc7 / test (default, 1, 2, linux.4xlarge.nvidia.gpu) (2/9)

Step: "Test" (same test_function_back_compat failure as in build 1/9).

See GitHub Actions build pull / linux-bionic-py3.7-clang9 / test (default, 2, 2, linux.2xlarge) (3/9)

Step: "Test" (same test_function_back_compat failure as in build 1/9).

See GitHub Actions build pull / linux-xenial-py3.7-gcc5.4 / test (default, 1, 2, linux.2xlarge) (4/9)

Step: "Test" (same test_function_back_compat failure as in build 1/9).

See GitHub Actions build pull / linux-xenial-py3.7-clang7-asan / test (default, 2, 3, linux.2xlarge) (5/9)

Step: "Test" (same test_function_back_compat failure as in build 1/9).

See GitHub Actions build pull / linux-bionic-py3.7-clang9 / test (noarch, 1, 1, linux.2xlarge) (6/9)

Step: "Test" (same test_function_back_compat failure as in build 1/9).

See GitHub Actions build pull / linux-xenial-py3.7-gcc7 / test (default, 1, 2, linux.2xlarge) (7/9)

Step: "Test" (same test_function_back_compat failure as in build 1/9).

See GitHub Actions build pull / win-vs2019-cuda11.3-py3 / test (default, 2, 2, windows.8xlarge.nvidia.gpu) (8/9)

Step: "Test" (same test_function_back_compat failure as in build 1/9).

See GitHub Actions build pull / linux-bionic-rocm4.5-py3.7 / test (default, 1, 2, linux.rocm.gpu) (9/9)

Step: "Test" (same test_function_back_compat failure as in build 1/9).

🚧 1 fixed upstream failure:

These were probably caused by upstream breakages that were already fixed.

Please rebase on the viable/strict branch.

If your commit is older than viable/strict, run these commands:

git fetch https://github.com/pytorch/pytorch viable/strict
git rebase FETCH_HEAD

This comment was automatically generated by Dr. CI.

Please report bugs/suggestions to the (internal) Dr. CI Users group.


@malfet (Contributor, Author) commented on Mar 22, 2022

@pytorchbot merge this

@malfet added the `topic: not user facing` label on Mar 22, 2022
facebook-github-bot pushed a commit that referenced this pull request Mar 23, 2022
Summary:
Also, fix regression/typo introduced by #74549

Pull Request resolved: #74570

Approved by: https://github.com/seemethere

Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/ca500f32dac25da237aab3761b28405a30517f9e

Reviewed By: seemethere, osalpekar

Differential Revision: D35065518

Pulled By: malfet

fbshipit-source-id: a8a953c35ac39a2c73fe53787bae6c2e88adb323
shahofblah pushed a commit that referenced this pull request Mar 25, 2022
Also, fix regression/typo introduced by #74549

Pull Request resolved: #74570

Approved by: https://github.com/seemethere
@facebook-github-bot deleted the gh/malfet/15/head branch on March 26, 2022 at 14:16
3 participants