2022 FIFA World Cup Croatia vs. Morocco odds, picks: Predictions and best bets for Saturday's third-place match from proven soccer expert

Summarized by: Live Sports Direct
 
RuntimeError at /getsummary/

CUDA out of memory. Tried to allocate 20.00 MiB (GPU 0; 10.76 GiB total capacity; 1.46 GiB already allocated; 13.69 MiB free; 1.47 GiB reserved in total by PyTorch)
Request Method: POST
Request URL: http://192.168.1.145:4080/getsummary/?min=122&max=1000
Django Version: 3.1.1
Exception Type: RuntimeError
Exception Value: CUDA out of memory. Tried to allocate 20.00 MiB (GPU 0; 10.76 GiB total capacity; 1.46 GiB already allocated; 13.69 MiB free; 1.47 GiB reserved in total by PyTorch)
Exception Location: /usr/local/lib/python3.8/site-packages/torch/nn/modules/module.py, line 850, in convert
Python Executable: /usr/local/bin/python
Python Version: 3.8.12
Python Path:
['/app',
 '/usr/local/bin',
 '/usr/local/lib/python38.zip',
 '/usr/local/lib/python3.8',
 '/usr/local/lib/python3.8/lib-dynload',
 '/usr/local/lib/python3.8/site-packages']
Server time: Thu, 15 Dec 2022 23:31:12 +0000
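The figures in the exception message are worth decoding before touching the code: PyTorch itself has reserved only 1.47 GiB of the 10.76 GiB card, yet just 13.69 MiB is free, so roughly 9.3 GiB is held by something outside this process (another worker, a second server instance, or a stale GPU process). A small helper — hypothetical, for illustration only — can pull the numbers out of the message and make that gap explicit:

```python
import re

def parse_cuda_oom(message: str) -> dict:
    """Extract the memory figures (normalized to MiB) from a PyTorch CUDA OOM message."""
    units = {"MiB": 1.0, "GiB": 1024.0}
    fields = {
        "tried": r"Tried to allocate ([\d.]+) (MiB|GiB)",
        "total": r"([\d.]+) (MiB|GiB) total capacity",
        "allocated": r"([\d.]+) (MiB|GiB) already allocated",
        "free": r"([\d.]+) (MiB|GiB) free",
        "reserved": r"([\d.]+) (MiB|GiB) reserved",
    }
    out = {}
    for name, pattern in fields.items():
        match = re.search(pattern, message)
        if match:
            out[name] = float(match.group(1)) * units[match.group(2)]
    # Memory that is neither reserved by this process nor free is held elsewhere.
    out["held_elsewhere"] = out["total"] - out["reserved"] - out["free"]
    return out

msg = ("CUDA out of memory. Tried to allocate 20.00 MiB (GPU 0; 10.76 GiB "
       "total capacity; 1.46 GiB already allocated; 13.69 MiB free; "
       "1.47 GiB reserved in total by PyTorch)")
stats = parse_cuda_oom(msg)
print(round(stats["held_elsewhere"]))  # prints 9499 (MiB held outside this process)
```

In other words, the 20 MiB allocation is not the problem; nearly the whole card is occupied by something other than this Django worker, so `nvidia-smi` is the place to look first.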

Traceback:

  • /usr/local/lib/python3.8/site-packages/django/core/handlers/exception.py, line 47, in inner

                    response = await sync_to_async(response_for_exception)(request, exc)
                return response
            return inner
        else:
            @wraps(get_response)
            def inner(request):
                try:
  >                 response = get_response(request)
                except Exception as exc:
                    response = response_for_exception(request, exc)
                return response
            return inner

    Local variables:
      exc          = RuntimeError('CUDA out of memory. Tried to allocate 20.00 MiB (GPU 0; 10.76 GiB total capacity; 1.46 GiB already allocated; 13.69 MiB free; 1.47 GiB reserved in total by PyTorch)')
      get_response = <bound method BaseHandler._get_response of <django.core.handlers.wsgi.WSGIHandler object at 0x7f79d1149e50>>
      request      = <WSGIRequest: POST '/getsummary/?min=122&max=1000'>
  • /usr/local/lib/python3.8/site-packages/django/core/handlers/base.py, line 165, in _get_response

        def _get_response(self, request):
            """
            Resolve and call the view, then apply view, exception, and
            template_response middleware. This method is everything that happens
            inside the request/response middleware.
            """
            response = None
  >         callback, callback_args, callback_kwargs = self.resolve_request(request)
            # Apply view middleware
            for middleware_method in self._view_middleware:
                response = middleware_method(request, callback, callback_args, callback_kwargs)
                if response:
                    break

    Local variables:
      request  = <WSGIRequest: POST '/getsummary/?min=122&max=1000'>
      response = None
      self     = <django.core.handlers.wsgi.WSGIHandler object at 0x7f79d1149e50>
  • /usr/local/lib/python3.8/site-packages/django/core/handlers/base.py, line 288, in resolve_request

            if hasattr(request, 'urlconf'):
                urlconf = request.urlconf
                set_urlconf(urlconf)
                resolver = get_resolver(urlconf)
            else:
                resolver = get_resolver()
            # Resolve the view, and assign the match object back to the request.
  >         resolver_match = resolver.resolve(request.path_info)
            request.resolver_match = resolver_match
            return resolver_match

        def check_response(self, response, callback, name=None):
            """
            Raise an error if the view returned None or an uncalled coroutine.

    Local variables:
      request  = <WSGIRequest: POST '/getsummary/?min=122&max=1000'>
      resolver = <URLResolver 'web_project.urls' (None:None) '^/'>
      self     = <django.core.handlers.wsgi.WSGIHandler object at 0x7f79d1149e50>
  • /usr/local/lib/python3.8/site-packages/django/urls/resolvers.py, line 545, in resolve

        def resolve(self, path):
            path = str(path)  # path may be a reverse_lazy object
            tried = []
            match = self.pattern.match(path)
            if match:
                new_path, args, kwargs = match
  >             for pattern in self.url_patterns:
                    try:
                        sub_match = pattern.resolve(new_path)
                    except Resolver404 as e:
                        sub_tried = e.args[0].get('tried')
                        if sub_tried is not None:
                            tried.extend([pattern] + t for t in sub_tried)

    Local variables:
      args     = ()
      kwargs   = {}
      match    = ('getsummary/', (), {})
      new_path = 'getsummary/'
      path     = '/getsummary/'
      self     = <URLResolver 'web_project.urls' (None:None) '^/'>
      tried    = []
  • /usr/local/lib/python3.8/site-packages/django/utils/functional.py, line 48, in __get__

            """
            Call the function and put the return value in instance.__dict__ so that
            subsequent attribute access on the instance returns the cached value
            instead of calling cached_property.__get__().
            """
            if instance is None:
                return self
  >         res = instance.__dict__[self.name] = self.func(instance)
            return res

    class classproperty:
        """
        Decorator that converts a method with a single cls argument into a property

    Local variables:
      cls      = <class 'django.urls.resolvers.URLResolver'>
      instance = <URLResolver 'web_project.urls' (None:None) '^/'>
      self     = <django.utils.functional.cached_property object at 0x7f79d1149370>
  • /usr/local/lib/python3.8/site-packages/django/urls/resolvers.py, line 589, in url_patterns

                return import_module(self.urlconf_name)
            else:
                return self.urlconf_name

        @cached_property
        def url_patterns(self):
            # urlconf_module might be a valid set of patterns, so we default to it
  >         patterns = getattr(self.urlconf_module, "urlpatterns", self.urlconf_module)
            try:
                iter(patterns)
            except TypeError as e:
                msg = (
                    "The included URLconf '{name}' does not appear to have any "
                    "patterns in it. If you see valid patterns in the file then "

    Local variables:
      self = <URLResolver 'web_project.urls' (None:None) '^/'>
  • /usr/local/lib/python3.8/site-packages/django/utils/functional.py, line 48, in __get__

            """
            Call the function and put the return value in instance.__dict__ so that
            subsequent attribute access on the instance returns the cached value
            instead of calling cached_property.__get__().
            """
            if instance is None:
                return self
  >         res = instance.__dict__[self.name] = self.func(instance)
            return res

    class classproperty:
        """
        Decorator that converts a method with a single cls argument into a property

    Local variables:
      cls      = <class 'django.urls.resolvers.URLResolver'>
      instance = <URLResolver 'web_project.urls' (None:None) '^/'>
      self     = <django.utils.functional.cached_property object at 0x7f79d1149310>
  • /usr/local/lib/python3.8/site-packages/django/urls/resolvers.py, line 582, in urlconf_module

                        tried.append([pattern])
                raise Resolver404({'tried': tried, 'path': new_path})
            raise Resolver404({'path': path})

        @cached_property
        def urlconf_module(self):
            if isinstance(self.urlconf_name, str):
  >             return import_module(self.urlconf_name)
            else:
                return self.urlconf_name

        @cached_property
        def url_patterns(self):
            # urlconf_module might be a valid set of patterns, so we default to it

    Local variables:
      self = <URLResolver 'web_project.urls' (None:None) '^/'>
  • /usr/local/lib/python3.8/importlib/__init__.py, line 127, in import_module

                msg = ("the 'package' argument is required to perform a relative "
                       "import for {!r}")
                raise TypeError(msg.format(name))
            for character in name:
                if character != '.':
                    break
                level += 1
  >     return _bootstrap._gcd_import(name[level:], package, level)

    _RELOADING = {}

    def reload(module):

    Local variables:
      level   = 0
      name    = 'web_project.urls'
      package = None
  • <frozen importlib._bootstrap>, line 1014, in _gcd_import
    (source code not available)
    Local variables:
      level   = 0
      name    = 'web_project.urls'
      package = None

  • <frozen importlib._bootstrap>, line 991, in _find_and_load
    (source code not available)
    Local variables:
      import_ = <function _gcd_import at 0x7f79d29f4430>
      module  = <object object at 0x7f79d29cb060>
      name    = 'web_project.urls'

  • <frozen importlib._bootstrap>, line 975, in _find_and_load_unlocked
    (source code not available)
    Local variables:
      import_       = <function _gcd_import at 0x7f79d29f4430>
      name          = 'web_project.urls'
      parent        = 'web_project'
      parent_module = <module 'web_project' from '/app/web_project/__init__.py'>
      path          = ['/app/web_project']
      spec          = ModuleSpec(name='web_project.urls', loader=<_frozen_importlib_external.SourceFileLoader object at 0x7f7953f0d0d0>, origin='/app/web_project/urls.py')

  • <frozen importlib._bootstrap>, line 671, in _load_unlocked
    (source code not available)
    Local variables:
      module = <module 'web_project.urls' from '/app/web_project/urls.py'>
      spec   = ModuleSpec(name='web_project.urls', loader=<_frozen_importlib_external.SourceFileLoader object at 0x7f7953f0d0d0>, origin='/app/web_project/urls.py')

  • <frozen importlib._bootstrap_external>, line 843, in exec_module
    (source code not available)
    Local variables:
      code   = <code object <module> at 0x7f79580f3920, file "/app/web_project/urls.py", line 1>
      module = <module 'web_project.urls' from '/app/web_project/urls.py'>
      self   = <_frozen_importlib_external.SourceFileLoader object at 0x7f7953f0d0d0>
  • <frozen importlib._bootstrap>, line 219, in _call_with_frames_removed
    (source code not available)
    Local variables:
      args = (<code object <module> at 0x7f79580f3920, file "/app/web_project/urls.py", line 1>,
              {'__builtins__': {… standard builtins module dict, trimmed …}, …})
      f    = <built-in function exec>
      kwds = {}
  • /app/web_project/urls.py, line 5, in <module>

        from django.contrib import admin
        from django.urls import include, path

        urlpatterns = [
  >         path("", include("hello.urls")),
            path('admin/', admin.site.urls)
        ]

    Local variables:
      __builtins__ = {… standard builtins module dict, trimmed …}
      __cached__   = '/app/web_project/__pycache__/urls.cpython-38.pyc'
      __doc__      = None
      __file__     = '/app/web_project/urls.py'
      __loader__   = <_frozen_importlib_external.SourceFileLoader object at 0x7f7953f0d0d0>
      __name__     = 'web_project.urls'
      __package__  = 'web_project'
      __spec__     = ModuleSpec(name='web_project.urls', loader=<_frozen_importlib_external.SourceFileLoader object at 0x7f7953f0d0d0>, origin='/app/web_project/urls.py')
      admin        = <module 'django.contrib.admin' from '/usr/local/lib/python3.8/site-packages/django/contrib/admin/__init__.py'>
      include      = <function include at 0x7f79d114f5e0>
      path         = functools.partial(<function _path at 0x7f79d114f700>, Pattern=<class 'django.urls.resolvers.RoutePattern'>)
  • /usr/local/lib/python3.8/site-packages/django/urls/conf.py, line 34, in include

                    'provide the namespace argument to include() instead.' % len(arg)
                )
        else:
            # No namespace hint - use manually provided namespace.
            urlconf_module = arg

        if isinstance(urlconf_module, str):
  >         urlconf_module = import_module(urlconf_module)
        patterns = getattr(urlconf_module, 'urlpatterns', urlconf_module)
        app_name = getattr(urlconf_module, 'app_name', app_name)
        if namespace and not app_name:
            raise ImproperlyConfigured(
                'Specifying a namespace in include() without providing an app_name '
                'is not supported. Set the app_name attribute in the included '

    Local variables:
      app_name       = None
      arg            = 'hello.urls'
      namespace      = None
      urlconf_module = 'hello.urls'
  • /usr/local/lib/python3.8/importlib/__init__.py, line 127, in import_module

                msg = ("the 'package' argument is required to perform a relative "
                       "import for {!r}")
                raise TypeError(msg.format(name))
            for character in name:
                if character != '.':
                    break
                level += 1
  >     return _bootstrap._gcd_import(name[level:], package, level)

    _RELOADING = {}

    def reload(module):

    Local variables:
      level   = 0
      name    = 'hello.urls'
      package = None
  • <frozen importlib._bootstrap>, line 1014, in _gcd_import
    (source code not available)
    Local variables:
      level   = 0
      name    = 'hello.urls'
      package = None

  • <frozen importlib._bootstrap>, line 991, in _find_and_load
    (source code not available)
    Local variables:
      import_ = <function _gcd_import at 0x7f79d29f4430>
      module  = <object object at 0x7f79d29cb060>
      name    = 'hello.urls'

  • <frozen importlib._bootstrap>, line 975, in _find_and_load_unlocked
    (source code not available)
    Local variables:
      import_       = <function _gcd_import at 0x7f79d29f4430>
      name          = 'hello.urls'
      parent        = 'hello'
      parent_module = <module 'hello' from '/app/hello/__init__.py'>
      path          = ['/app/hello']
      spec          = ModuleSpec(name='hello.urls', loader=<_frozen_importlib_external.SourceFileLoader object at 0x7f7953f0d940>, origin='/app/hello/urls.py')

  • <frozen importlib._bootstrap>, line 671, in _load_unlocked
    (source code not available)
    Local variables:
      module = <module 'hello.urls' from '/app/hello/urls.py'>
      spec   = ModuleSpec(name='hello.urls', loader=<_frozen_importlib_external.SourceFileLoader object at 0x7f7953f0d940>, origin='/app/hello/urls.py')

  • <frozen importlib._bootstrap_external>, line 843, in exec_module
    (source code not available)
    Local variables:
      code   = <code object <module> at 0x7f79580f35b0, file "/app/hello/urls.py", line 1>
      module = <module 'hello.urls' from '/app/hello/urls.py'>
      self   = <_frozen_importlib_external.SourceFileLoader object at 0x7f7953f0d940>
  • <frozen importlib._bootstrap>, line 219, in _call_with_frames_removed
    (source code not available)
    Local variables:
      args = (<code object <module> at 0x7f79580f35b0, file "/app/hello/urls.py", line 1>,
              {'__builtins__': {… standard builtins module dict, trimmed …}, …})
      f    = <built-in function exec>
      kwds = {}
  • /app/hello/urls.py, line 2, in <module>

        from django.urls import path
  >     from hello import views

        urlpatterns = [
            path("", views.home, name="home"),
            path("text_summary/", views.text_summary, name="bart summarizing"),
            path("getsummary/", views.getsummary, name="bart summarizing"),
            path("title_summary/", views.title_summary, name="title summarizing"),

    Local variables:
      __builtins__ = {… standard builtins module dict, trimmed …}
      __cached__   = '/app/hello/__pycache__/urls.cpython-38.pyc'
      __doc__      = None
      __file__     = '/app/hello/urls.py'
      __loader__   = <_frozen_importlib_external.SourceFileLoader object at 0x7f7953f0d940>
      __name__     = 'hello.urls'
      __package__  = 'hello'
      __spec__     = ModuleSpec(name='hello.urls', loader=<_frozen_importlib_external.SourceFileLoader object at 0x7f7953f0d940>, origin='/app/hello/urls.py')
    path
    functools.partial(<function _path at 0x7f79d114f700>, Pattern=<class 'django.urls.resolvers.RoutePattern'>)
  • /app/hello/views.py, line 4, in <module>
    1. from django.http import HttpResponse
    2. from django.http.response import FileResponse
    3. from django.views.decorators.csrf import csrf_exempt
    4. from hello import config        ← error line
    5. import json
    6. #from transformers import pipeline
    7. #from keytotext import pipeline
    8. import http
    Variable Value
    FileResponse
    <class 'django.http.response.FileResponse'>
    HttpResponse
    <class 'django.http.response.HttpResponse'>
    __builtins__
    {… <trimmed: standard builtins module dict, identical to the dump above> …}
    __cached__
    '/app/hello/__pycache__/views.cpython-38.pyc'
    __doc__
    None
    __file__
    '/app/hello/views.py'
    __loader__
    <_frozen_importlib_external.SourceFileLoader object at 0x7f7953b3ca00>
    __name__
    'hello.views'
    __package__
    'hello'
    __spec__
    ModuleSpec(name='hello.views', loader=<_frozen_importlib_external.SourceFileLoader object at 0x7f7953b3ca00>, origin='/app/hello/views.py')
    csrf_exempt
    <function csrf_exempt at 0x7f79d0e281f0>
  • /app/hello/config.py, line 17, in <module>
    13. #tokenizer = PegasusTokenizer.from_pretrained('google/pegasus-reddit_tifu')
    14. #model = BartForConditionalGeneration.from_pretrained('facebook/bart-large-cnn').to(device)
    15. #tokenizer = BartTokenizer.from_pretrained('facebook/bart-large-cnn')
    16. #model = BartForConditionalGeneration.from_pretrained('philschmid/bart-large-cnn-samsum').to(device)
    17. model = BartForConditionalGeneration.from_pretrained('./bart-large-cnn-samsum').to(device)        ← error line
    18. #model.save_pretrained("./bart-large-cnn-samsum")
    19. #tokenizer = BartTokenizer.from_pretrained('philschmid/bart-large-cnn-samsum')
    20. tokenizer = BartTokenizer.from_pretrained('./bart-large-cnn-samsum')
    Variable Value
    BartConfig
    <class 'transformers.models.bart.configuration_bart.BartConfig'>
    BartForConditionalGeneration
    <class 'transformers.models.bart.modeling_bart.BartForConditionalGeneration'>
    BartTokenizer
    <class 'transformers.models.bart.tokenization_bart.BartTokenizer'>
    __builtins__
    {'ArithmeticError': <class 'ArithmeticError'>,
     'AssertionError': <class 'AssertionError'>,
     'AttributeError': <class 'AttributeError'>,
     'BaseException': <class 'BaseException'>,
     'BlockingIOError': <class 'BlockingIOError'>,
     'BrokenPipeError': <class 'BrokenPipeError'>,
     'BufferError': <class 'BufferError'>,
     'BytesWarning': <class 'BytesWarning'>,
     'ChildProcessError': <class 'ChildProcessError'>,
     'ConnectionAbortedError': <class 'ConnectionAbortedError'>,
     'ConnectionError': <class 'ConnectionError'>,
     'ConnectionRefusedError': <class 'ConnectionRefusedError'>,
     'ConnectionResetError': <class 'ConnectionResetError'>,
     'DeprecationWarning': <class 'DeprecationWarning'>,
     'EOFError': <class 'EOFError'>,
     'Ellipsis': Ellipsis,
     'EnvironmentError': <class 'OSError'>,
     'Exception': <class 'Exception'>,
     'False': False,
     'FileExistsError': <class 'FileExistsError'>,
     'FileNotFoundError': <class 'FileNotFoundError'>,
     'FloatingPointError': <class 'FloatingPointError'>,
     'FutureWarning': <class 'FutureWarning'>,
     'GeneratorExit': <class 'GeneratorExit'>,
     'IOError': <class 'OSError'>,
     'ImportError': <class 'ImportError'>,
     'ImportWarning': <class 'ImportWarning'>,
     'IndentationError': <class 'IndentationError'>,
     'IndexError': <class 'IndexError'>,
     'InterruptedError': <class 'InterruptedError'>,
     'IsADirectoryError': <class 'IsADirectoryError'>,
     'KeyError': <class 'KeyError'>,
     'KeyboardInterrupt': <class 'KeyboardInterrupt'>,
     'LookupError': <class 'LookupError'>,
     'MemoryError': <class 'MemoryError'>,
     'ModuleNotFoundError': <class 'ModuleNotFoundError'>,
     'NameError': <class 'NameError'>,
     'None': None,
     'NotADirectoryError': <class 'NotADirectoryError'>,
     'NotImplemented': NotImplemented,
     'NotImplementedError': <class 'NotImplementedError'>,
     'OSError': <class 'OSError'>,
     'OverflowError': <class 'OverflowError'>,
     'PendingDeprecationWarning': <class 'PendingDeprecationWarning'>,
     'PermissionError': <class 'PermissionError'>,
     'ProcessLookupError': <class 'ProcessLookupError'>,
     'RecursionError': <class 'RecursionError'>,
     'ReferenceError': <class 'ReferenceError'>,
     'ResourceWarning': <class 'ResourceWarning'>,
     'RuntimeError': <class 'RuntimeError'>,
     'RuntimeWarning': <class 'RuntimeWarning'>,
     'StopAsyncIteration': <class 'StopAsyncIteration'>,
     'StopIteration': <class 'StopIteration'>,
     'SyntaxError': <class 'SyntaxError'>,
     'SyntaxWarning': <class 'SyntaxWarning'>,
     'SystemError': <class 'SystemError'>,
     'SystemExit': <class 'SystemExit'>,
     'TabError': <class 'TabError'>,
     'TimeoutError': <class 'TimeoutError'>,
     'True': True,
     'TypeError': <class 'TypeError'>,
     'UnboundLocalError': <class 'UnboundLocalError'>,
     'UnicodeDecodeError': <class 'UnicodeDecodeError'>,
     'UnicodeEncodeError': <class 'UnicodeEncodeError'>,
     'UnicodeError': <class 'UnicodeError'>,
     'UnicodeTranslateError': <class 'UnicodeTranslateError'>,
     'UnicodeWarning': <class 'UnicodeWarning'>,
     'UserWarning': <class 'UserWarning'>,
     'ValueError': <class 'ValueError'>,
     'Warning': <class 'Warning'>,
     'ZeroDivisionError': <class 'ZeroDivisionError'>,
     '__build_class__': <built-in function __build_class__>,
     '__debug__': True,
     '__doc__': 'Built-in functions, exceptions, and other objects.\n'
                '\n'
                "Noteworthy: None is the `nil' object; Ellipsis represents `...' "
                'in slices.',
     '__import__': <built-in function __import__>,
     '__loader__': <class '_frozen_importlib.BuiltinImporter'>,
     '__name__': 'builtins',
     '__package__': '',
     '__pybind11_internals_v4_gcc_libstdcpp_cxxabi1011__': <capsule object NULL at 0x7f79ba121d50>,
     '__spec__': ModuleSpec(name='builtins', loader=<class '_frozen_importlib.BuiltinImporter'>),
     'abs': <built-in function abs>,
     'all': <built-in function all>,
     'any': <built-in function any>,
     'ascii': <built-in function ascii>,
     'bin': <built-in function bin>,
     'bool': <class 'bool'>,
     'breakpoint': <built-in function breakpoint>,
     'bytearray': <class 'bytearray'>,
     'bytes': <class 'bytes'>,
     'callable': <built-in function callable>,
     'chr': <built-in function chr>,
     'classmethod': <class 'class… <trimmed 6683 bytes string>
    __cached__
    '/app/hello/__pycache__/config.cpython-38.pyc'
    __doc__
    None
    __file__
    '/app/hello/config.py'
    __loader__
    <_frozen_importlib_external.SourceFileLoader object at 0x7f79539df640>
    __name__
    'hello.config'
    __package__
    'hello'
    __spec__
    ModuleSpec(name='hello.config', loader=<_frozen_importlib_external.SourceFileLoader object at 0x7f79539df640>, origin='/app/hello/config.py')
    device
    device(type='cuda')
    torch
    <module 'torch' from '/usr/local/lib/python3.8/site-packages/torch/__init__.py'>
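  Note: the config.py frame above loads the full BART checkpoint and moves it to the GPU at module import time, which is what triggers the OOM once earlier allocations have consumed the card. A minimal sketch of a device-fallback wrapper — `load_with_fallback` and its device list are hypothetical names, not part of this app — could look like:

```python
# Hypothetical helper: try a loader on each device in order and fall back
# to the next one when PyTorch raises an out-of-memory RuntimeError.
def load_with_fallback(load_fn, devices=("cuda", "cpu")):
    last_err = None
    for device in devices:
        try:
            # load_fn is expected to build the model and move it to `device`
            return load_fn(device), device
        except RuntimeError as err:
            # A CUDA OOM surfaces as RuntimeError("CUDA out of memory. ...");
            # re-raise anything else unchanged.
            if "out of memory" not in str(err):
                raise
            last_err = err
    raise last_err
```

  In config.py this would wrap the existing call roughly as `model, dev = load_with_fallback(lambda d: BartForConditionalGeneration.from_pretrained('./bart-large-cnn-samsum').to(d))`, trading GPU speed for availability when memory is exhausted.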
  • /usr/local/lib/python3.8/site-packages/torch/nn/modules/module.py, line 852, in to
    847.         def convert(t):
    848.             if convert_to_format is not None and t.dim() in (4, 5):
    849.                 return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None,
    850.                             non_blocking, memory_format=convert_to_format)
    851.             return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
    852.         return self._apply(convert)        ← error line
    853.     def register_backward_hook(
    854.         self, hook: Callable[['Module', _grad_t, _grad_t], Union[None, Tensor]]
    855.     ) -> RemovableHandle:
    856.         r"""Registers a backward hook on the module.
    Variable Value
    args
    (device(type='cuda'),)
    convert
    <function Module.to.<locals>.convert at 0x7f7960363430>
    convert_to_format
    None
    device
    device(type='cuda')
    dtype
    None
    kwargs
    {}
    non_blocking
    False
    self
    BartForConditionalGeneration(
      (model): BartModel(
        (shared): Embedding(50264, 1024, padding_idx=1)
        (encoder): BartEncoder(
          (embed_tokens): Embedding(50264, 1024, padding_idx=1)
          (embed_positions): BartLearnedPositionalEmbedding(1026, 1024)
          (layers): ModuleList(
            (0): BartEncoderLayer(
              (self_attn): BartAttention(
                (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
                (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
                (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
                (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
              )
              (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
              (fc1): Linear(in_features=1024, out_features=4096, bias=True)
              (fc2): Linear(in_features=4096, out_features=1024, bias=True)
              (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
            )
            (1)-(5): 5 x BartEncoderLayer( … identical to (0) above … ) … <trimmed 23463 bytes string>
  • /usr/local/lib/python3.8/site-packages/torch/nn/modules/module.py, line 530, in _apply
    525.         if buffer not in mod._buffers.values():
    526.             raise AttributeError("`" + buffer_name + "` is not a buffer")
    527.         return buffer
    528.     def _apply(self, fn):
    529.         for module in self.children():
    530.             module._apply(fn)        ← error line
    531.         def compute_should_use_set_data(tensor, tensor_applied):
    532.             if torch._has_compatible_shallow_copy_type(tensor, tensor_applied):
    533.                 # If the new tensor has compatible tensor type as the existing tensor,
    534.                 # the current behavior is to change the tensor in-place using `.data =`,
    535.                 # and the future behavior is to overwrite the existing tensor. However,
    Variable Value
    fn
    <function Module.to.<locals>.convert at 0x7f7960363430>
    module
    BartModel(
      (shared): Embedding(50264, 1024, padding_idx=1)
      (encoder): BartEncoder(
        (embed_tokens): Embedding(50264, 1024, padding_idx=1)
        (embed_positions): BartLearnedPositionalEmbedding(1026, 1024)
        (layers): ModuleList(
          (0): BartEncoderLayer(
            (self_attn): BartAttention(
              (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
              (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
              (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
              (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
            )
            (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
            (fc1): Linear(in_features=1024, out_features=4096, bias=True)
            (fc2): Linear(in_features=4096, out_features=1024, bias=True)
            (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
          )
          (1)-(5): 5 x BartEncoderLayer( … identical to (0) above … ) … <trimmed 22574 bytes string>
    self
    BartForConditionalGeneration(
      (model): BartModel(
        (shared): Embedding(50264, 1024, padding_idx=1)
        (encoder): BartEncoder(
          (embed_tokens): Embedding(50264, 1024, padding_idx=1)
          (embed_positions): BartLearnedPositionalEmbedding(1026, 1024)
          (layers): ModuleList(
            (0): BartEncoderLayer(
              (self_attn): BartAttention(
                (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
                (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
                (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
                (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
              )
              (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
              (fc1): Linear(in_features=1024, out_features=4096, bias=True)
              (fc2): Linear(in_features=4096, out_features=1024, bias=True)
              (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
            )
            (1)-(5): 5 x BartEncoderLayer( … identical to (0) above … ) … <trimmed 23463 bytes string>
  • /usr/local/lib/python3.8/site-packages/torch/nn/modules/module.py, line 530, in _apply
    525.         if buffer not in mod._buffers.values():
    526.             raise AttributeError("`" + buffer_name + "` is not a buffer")
    527.         return buffer
    528.     def _apply(self, fn):
    529.         for module in self.children():
    530.             module._apply(fn)        ← error line
    531.         def compute_should_use_set_data(tensor, tensor_applied):
    532.             if torch._has_compatible_shallow_copy_type(tensor, tensor_applied):
    533.                 # If the new tensor has compatible tensor type as the existing tensor,
    534.                 # the current behavior is to change the tensor in-place using `.data =`,
    535.                 # and the future behavior is to overwrite the existing tensor. However,
    Variable Value
    fn
    <function Module.to.<locals>.convert at 0x7f7960363430>
    module
    BartDecoder(
      (embed_tokens): Embedding(50264, 1024, padding_idx=1)
      (embed_positions): BartLearnedPositionalEmbedding(1026, 1024)
      (layers): ModuleList(
        (0): BartDecoderLayer(
          (self_attn): BartAttention(
            (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
            (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
            (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
            (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
          )
          (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
          (encoder_attn): BartAttention(
            (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
            (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
            (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
            (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
          )
          (encoder_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
          (fc1): Linear(in_features=1024, out_features=4096, bias=True)
          (fc2): Linear(in_features=4096, out_features=1024, bias=True)
          (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
        )
        (1)-(2): 2 x BartDecoderLayer( … identical to (0) above … )
        (3): BartDecoderLayer(
          (self_attn): BartAttention(
            (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
            (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
            (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
            (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
          )
          (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
          (encoder_attn): BartAttention(
            (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
            (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
            (q_pro… <trimmed 13420 bytes string>
    self
    BartModel(
      (shared): Embedding(50264, 1024, padding_idx=1)
      (encoder): BartEncoder(
        (embed_tokens): Embedding(50264, 1024, padding_idx=1)
        (embed_positions): BartLearnedPositionalEmbedding(1026, 1024)
        (layers): ModuleList(
          (0): BartEncoderLayer(
            (self_attn): BartAttention(
              (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
              (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
              (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
              (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
            )
            (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
            (fc1): Linear(in_features=1024, out_features=4096, bias=True)
            (fc2): Linear(in_features=4096, out_features=1024, bias=True)
            (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
          )
          (1)-(…): further BartEncoderLayer blocks, identical to (0)
          … <trimmed 22574 bytes string>
  • /usr/local/lib/python3.8/site-packages/torch/nn/modules/module.py, line 530, in _apply
            if buffer not in mod._buffers.values():
                raise AttributeError("`" + buffer_name + "` is not a buffer")
            return buffer
        def _apply(self, fn):
            for module in self.children():
                module._apply(fn)   # <- line 530, the currently executing line
            def compute_should_use_set_data(tensor, tensor_applied):
                if torch._has_compatible_shallow_copy_type(tensor, tensor_applied):
                    # If the new tensor has compatible tensor type as the existing tensor,
                    # the current behavior is to change the tensor in-place using `.data =`,
                    # and the future behavior is to overwrite the existing tensor. However,
    Variable Value
    fn
    <function Module.to.<locals>.convert at 0x7f7960363430>
    module
    ModuleList(
      (0): BartDecoderLayer(
        (self_attn): BartAttention(
          (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
          (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
          (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
          (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
        )
        (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
        (encoder_attn): BartAttention(
          (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
          (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
          (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
          (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
        )
        (encoder_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
        (fc1): Linear(in_features=1024, out_features=4096, bias=True)
        (fc2): Linear(in_features=4096, out_features=1024, bias=True)
        (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
      )
      (1)-(…): further BartDecoderLayer blocks, identical to (0)
      … <trimmed 12735 bytes string>
    self
    BartDecoder(
      (embed_tokens): Embedding(50264, 1024, padding_idx=1)
      (embed_positions): BartLearnedPositionalEmbedding(1026, 1024)
      (layers): ModuleList(
        (0): BartDecoderLayer(
          (self_attn): BartAttention(
            (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
            (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
            (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
            (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
          )
          (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
          (encoder_attn): BartAttention(
            (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
            (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
            (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
            (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
          )
          (encoder_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
          (fc1): Linear(in_features=1024, out_features=4096, bias=True)
          (fc2): Linear(in_features=4096, out_features=1024, bias=True)
          (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
        )
        (1)-(…): further BartDecoderLayer blocks, identical to (0)
        … <trimmed 13420 bytes string>
  • /usr/local/lib/python3.8/site-packages/torch/nn/modules/module.py, line 530, in _apply
    (source context identical to the module.py, line 530 frame above)
    Variable Value
    fn
    <function Module.to.<locals>.convert at 0x7f7960363430>
    module
    BartDecoderLayer(
      (self_attn): BartAttention(
        (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
        (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
        (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
        (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
      )
      (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
      (encoder_attn): BartAttention(
        (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
        (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
        (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
        (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
      )
      (encoder_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
      (fc1): Linear(in_features=1024, out_features=4096, bias=True)
      (fc2): Linear(in_features=4096, out_features=1024, bias=True)
      (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
    )
    self
    ModuleList(
      (0): BartDecoderLayer(
        (self_attn): BartAttention(
          (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
          (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
          (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
          (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
        )
        (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
        (encoder_attn): BartAttention(
          (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
          (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
          (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
          (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
        )
        (encoder_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
        (fc1): Linear(in_features=1024, out_features=4096, bias=True)
        (fc2): Linear(in_features=4096, out_features=1024, bias=True)
        (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
      )
      (1)-(…): further BartDecoderLayer blocks, identical to (0)
      … <trimmed 12735 bytes string>
  • /usr/local/lib/python3.8/site-packages/torch/nn/modules/module.py, line 530, in _apply
    (source context identical to the module.py, line 530 frame above)
    Variable Value
    fn
    <function Module.to.<locals>.convert at 0x7f7960363430>
    module
    BartAttention(
      (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
      (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
      (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
      (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
    )
    self
    BartDecoderLayer(
      (self_attn): BartAttention(
        (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
        (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
        (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
        (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
      )
      (self_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
      (encoder_attn): BartAttention(
        (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
        (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
        (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
        (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
      )
      (encoder_attn_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
      (fc1): Linear(in_features=1024, out_features=4096, bias=True)
      (fc2): Linear(in_features=4096, out_features=1024, bias=True)
      (final_layer_norm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
    )
  • /usr/local/lib/python3.8/site-packages/torch/nn/modules/module.py, line 530, in _apply
    (source context identical to the module.py, line 530 frame above)
    Variable Value
    fn
    <function Module.to.<locals>.convert at 0x7f7960363430>
    module
    Linear(in_features=1024, out_features=1024, bias=True)
    self
    BartAttention(
      (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
      (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
      (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
      (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
    )
  • /usr/local/lib/python3.8/site-packages/torch/nn/modules/module.py, line 552, in _apply
        for key, param in self._parameters.items():
            if param is not None:
                # Tensors stored in modules are graph leaves, and we don't want to
                # track autograd history of `param_applied`, so we have to use
                # `with torch.no_grad():`
                with torch.no_grad():
                    param_applied = fn(param)   # <- line 552, the currently executing line
                should_use_set_data = compute_should_use_set_data(param, param_applied)
                if should_use_set_data:
                    param.data = param_applied
                else:
                    assert isinstance(param, Parameter)
                    assert param.is_leaf
    Variable Value
    compute_should_use_set_data
    <function Module._apply.<locals>.compute_should_use_set_data at 0x7f796040f8b0>
    fn
    <function Module.to.<locals>.convert at 0x7f7960363430>
    key
    'weight'
    param
    Parameter containing:
    tensor([[-0.0432, -0.0172,  0.0116,  ...,  0.0431, -0.0731,  0.1250],
            [-0.0366,  0.0531,  0.0582,  ...,  0.0604,  0.0161,  0.1752],
            [ 0.0065, -0.0547, -0.0356,  ...,  0.0239,  0.0537,  0.0665],
            ...,
            [ 0.0140,  0.0736,  0.0722,  ..., -0.0131,  0.0430, -0.0239],
            [ 0.0933,  0.1548,  0.0380,  ..., -0.0107, -0.0084, -0.0023],
            [ 0.0367,  0.0416, -0.0596,  ..., -0.1075,  0.1574,  0.0982]],
           requires_grad=True)
    self
    Linear(in_features=1024, out_features=1024, bias=True)
  • /usr/local/lib/python3.8/site-packages/torch/nn/modules/module.py, line 850, in convert
                    "Please file an issue at https://github.com/pytorch/pytorch/issues/new?template=bug-report.md "
                    "if a complex module does not work as expected.")
        def convert(t):
            if convert_to_format is not None and t.dim() in (4, 5):
                return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None,
                            non_blocking, memory_format=convert_to_format)
            return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)   # <- line 850, where the CUDA allocation fails
        return self._apply(convert)
    def register_backward_hook(
        self, hook: Callable[['Module', _grad_t, _grad_t], Union[None, Tensor]]
    ) -> RemovableHandle:
    Variable Value
    convert_to_format
    None
    device
    device(type='cuda')
    dtype
    None
    non_blocking
    False
    t
    Parameter containing:
    tensor([[-0.0432, -0.0172,  0.0116,  ...,  0.0431, -0.0731,  0.1250],
            [-0.0366,  0.0531,  0.0582,  ...,  0.0604,  0.0161,  0.1752],
            [ 0.0065, -0.0547, -0.0356,  ...,  0.0239,  0.0537,  0.0665],
            ...,
            [ 0.0140,  0.0736,  0.0722,  ..., -0.0131,  0.0430, -0.0239],
            [ 0.0933,  0.1548,  0.0380,  ..., -0.0107, -0.0084, -0.0023],
            [ 0.0367,  0.0416, -0.0596,  ..., -0.1075,  0.1574,  0.0982]],
           requires_grad=True)
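The repeated frames above are all the same `Module._apply` call descending the module tree: `Module.to(device)` builds a `convert` closure and hands it to `_apply`, which recurses into each child before converting that module's own parameters, so the stack shows one frame per nesting level (BartModel → BartDecoder → ModuleList → BartDecoderLayer → BartAttention → Linear) until the leaf's weight copy exhausts GPU memory. A minimal plain-Python sketch of that traversal (no torch required; all class and variable names here are illustrative, not the real API):

```python
# Sketch of the recursion visible in the traceback: .to(device) wraps the
# device in a convert() closure, then _apply() walks the tree depth-first,
# producing one stack frame per module, and converts each parameter at the leaves.

class FakeModule:
    def __init__(self, name, params=None, children=None):
        self.name = name
        self._parameters = params or {}   # name -> stand-in "tensor"
        self._children = children or []

    def _apply(self, fn, trace):
        trace.append(self.name)           # one _apply frame per module
        for child in self._children:      # recurse first, as torch does
            child._apply(fn, trace)
        for key, param in self._parameters.items():
            self._parameters[key] = fn(param)
        return self

    def to(self, device, trace):
        def convert(t):                   # closure over `device`, like module.py line 850
            return (t, device)            # stand-in for t.to(device)
        return self._apply(convert, trace)

# A tree shaped like the BART dump: model -> decoder -> layer -> attention -> linear
linear = FakeModule("q_proj", params={"weight": "W"})
attn = FakeModule("self_attn", children=[linear])
layer = FakeModule("layer0", children=[attn])
decoder = FakeModule("decoder", children=[layer])
model = FakeModule("model", children=[decoder])

trace = []
model.to("cuda", trace)
print(trace)                        # -> ['model', 'decoder', 'layer0', 'self_attn', 'q_proj']
print(linear._parameters["weight"]) # -> ('W', 'cuda')
```

In real torch the allocation happens inside `convert` at the deepest frame, which is why the OOM surfaces under `module.py, line 850, in convert`.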


Request information

USER

AnonymousUser

GET

Variable Value
min
'122'
max
'1000'

POST

No POST data

FILES

No FILES data

COOKIES

No cookie data

META

Variable Value
CONTENT_LENGTH
'2470'
CONTENT_TYPE
'text/plain; charset=utf-8'
HTTP_ACCEPT_ENCODING
'gzip, deflate, br'
HTTP_HOST
'192.168.1.145:4080'
PATH_INFO
'/getsummary/'
QUERY_STRING
'min=122&max=1000'
RAW_URI
'/getsummary/?min=122&max=1000'
REMOTE_ADDR
'192.168.1.190'
REMOTE_PORT
'55704'
REQUEST_METHOD
'POST'
SCRIPT_NAME
''
SERVER_NAME
'0.0.0.0'
SERVER_PORT
'4080'
SERVER_PROTOCOL
'HTTP/1.1'
SERVER_SOFTWARE
'gunicorn/20.0.4'
gunicorn.socket
<socket.socket fd=9, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=0, laddr=('172.17.0.2', 4080), raddr=('192.168.1.190', 55704)>
wsgi.errors
<gunicorn.http.wsgi.WSGIErrorsWrapper object at 0x7f7953fb39d0>
wsgi.file_wrapper
<class 'gunicorn.http.wsgi.FileWrapper'>
wsgi.input
<gunicorn.http.body.Body object at 0x7f7953fb3b80>
wsgi.input_terminated
True
wsgi.multiprocess
False
wsgi.multithread
False
wsgi.run_once
False
wsgi.url_scheme
'http'
wsgi.version
(1, 0)
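The view behind `/getsummary/` evidently moves the model to the GPU while handling the request, so an allocation failure becomes a 500 for the client. One defensive pattern, sketched here as an assumption rather than the app's actual code (torch reports CUDA OOM as a `RuntimeError` whose message begins "CUDA out of memory", exactly as in this page), is to catch that error and degrade to CPU; the sketch uses a stand-in model object so it runs without torch:

```python
# Illustrative only: `move_with_fallback` and `_FlakyModel` are hypothetical.
# Any object with a .to(device) method works, so this runs without torch.

def move_with_fallback(model, device):
    """Try model.to(device); on CUDA OOM, fall back to CPU instead of failing."""
    try:
        return model.to(device)
    except RuntimeError as exc:
        if "out of memory" in str(exc) and device != "cpu":
            return model.to("cpu")    # degrade gracefully
        raise                         # unrelated errors still propagate

class _FlakyModel:
    """Stand-in whose GPU transfer always fails, mimicking this traceback."""
    def __init__(self):
        self.device = None
    def to(self, device):
        if device == "cuda":
            raise RuntimeError("CUDA out of memory. Tried to allocate 20.00 MiB")
        self.device = device
        return self

m = move_with_fallback(_FlakyModel(), "cuda")
print(m.device)   # -> cpu
```

Loading the model once at startup (rather than per request) and serializing access to it would also keep the "already allocated" figure from growing across requests.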

Settings

Using settings module web_project.settings

Setting Value
ABSOLUTE_URL_OVERRIDES
{}
ADMINS
[]
ALLOWED_HOSTS
['*']
APPEND_SLASH
True
AUTHENTICATION_BACKENDS
['django.contrib.auth.backends.ModelBackend']
AUTH_PASSWORD_VALIDATORS
'********************'
AUTH_USER_MODEL
'auth.User'
BASE_DIR
PosixPath('/app')
CACHES
{'default': {'BACKEND': 'django.core.cache.backends.locmem.LocMemCache'}}
CACHE_MIDDLEWARE_ALIAS
'default'
CACHE_MIDDLEWARE_KEY_PREFIX
'********************'
CACHE_MIDDLEWARE_SECONDS
600
CSRF_COOKIE_AGE
31449600
CSRF_COOKIE_DOMAIN
None
CSRF_COOKIE_HTTPONLY
False
CSRF_COOKIE_NAME
'csrftoken'
CSRF_COOKIE_PATH
'/'
CSRF_COOKIE_SAMESITE
'Lax'
CSRF_COOKIE_SECURE
False
CSRF_FAILURE_VIEW
'django.views.csrf.csrf_failure'
CSRF_HEADER_NAME
'HTTP_X_CSRFTOKEN'
CSRF_TRUSTED_ORIGINS
[]
CSRF_USE_SESSIONS
False
DATABASES
{'default': {'ATOMIC_REQUESTS': False,
             'AUTOCOMMIT': True,
             'CONN_MAX_AGE': 0,
             'ENGINE': 'django.db.backends.sqlite3',
             'HOST': '',
             'NAME': PosixPath('/app/db.sqlite3'),
             'OPTIONS': {},
             'PASSWORD': '********************',
             'PORT': '',
             'TEST': {'CHARSET': None,
                      'COLLATION': None,
                      'MIGRATE': True,
                      'MIRROR': None,
                      'NAME': None},
             'TIME_ZONE': None,
             'USER': ''}}
DATABASE_ROUTERS = []
DATA_UPLOAD_MAX_MEMORY_SIZE = 2621440
DATA_UPLOAD_MAX_NUMBER_FIELDS = 1000
DATETIME_FORMAT = 'N j, Y, P'
DATETIME_INPUT_FORMATS =
['%Y-%m-%d %H:%M:%S',
 '%Y-%m-%d %H:%M:%S.%f',
 '%Y-%m-%d %H:%M',
 '%m/%d/%Y %H:%M:%S',
 '%m/%d/%Y %H:%M:%S.%f',
 '%m/%d/%Y %H:%M',
 '%m/%d/%y %H:%M:%S',
 '%m/%d/%y %H:%M:%S.%f',
 '%m/%d/%y %H:%M']
DATE_FORMAT = 'N j, Y'
DATE_INPUT_FORMATS =
['%Y-%m-%d',
 '%m/%d/%Y',
 '%m/%d/%y',
 '%b %d %Y',
 '%b %d, %Y',
 '%d %b %Y',
 '%d %b, %Y',
 '%B %d %Y',
 '%B %d, %Y',
 '%d %B %Y',
 '%d %B, %Y']
DEBUG = True
DEBUG_PROPAGATE_EXCEPTIONS = False
DECIMAL_SEPARATOR = '.'
DEFAULT_AUTO_FIELD = 'django.db.models.BigAutoField'
DEFAULT_CHARSET = 'utf-8'
DEFAULT_EXCEPTION_REPORTER = 'django.views.debug.ExceptionReporter'
DEFAULT_EXCEPTION_REPORTER_FILTER = 'django.views.debug.SafeExceptionReporterFilter'
DEFAULT_FILE_STORAGE = 'django.core.files.storage.FileSystemStorage'
DEFAULT_FROM_EMAIL = 'webmaster@localhost'
DEFAULT_HASHING_ALGORITHM = 'sha256'
DEFAULT_INDEX_TABLESPACE = ''
DEFAULT_TABLESPACE = ''
DISALLOWED_USER_AGENTS = []
EMAIL_BACKEND = 'django.core.mail.backends.smtp.EmailBackend'
EMAIL_HOST = 'localhost'
EMAIL_HOST_PASSWORD = '********************'
EMAIL_HOST_USER = ''
EMAIL_PORT = 25
EMAIL_SSL_CERTFILE = None
EMAIL_SSL_KEYFILE = '********************'
EMAIL_SUBJECT_PREFIX = '[Django] '
EMAIL_TIMEOUT = None
EMAIL_USE_LOCALTIME = False
EMAIL_USE_SSL = False
EMAIL_USE_TLS = False
FILE_UPLOAD_DIRECTORY_PERMISSIONS = None
FILE_UPLOAD_HANDLERS =
['django.core.files.uploadhandler.MemoryFileUploadHandler',
 'django.core.files.uploadhandler.TemporaryFileUploadHandler']
FILE_UPLOAD_MAX_MEMORY_SIZE = 2621440
FILE_UPLOAD_PERMISSIONS = 420
FILE_UPLOAD_TEMP_DIR = None
FIRST_DAY_OF_WEEK = 0
FIXTURE_DIRS = []
FORCE_SCRIPT_NAME = None
FORMAT_MODULE_PATH = None
FORM_RENDERER = 'django.forms.renderers.DjangoTemplates'
IGNORABLE_404_URLS = []
INSTALLED_APPS =
['django.contrib.admin',
 'django.contrib.auth',
 'django.contrib.contenttypes',
 'django.contrib.sessions',
 'django.contrib.messages',
 'django.contrib.staticfiles']
INTERNAL_IPS = []
LANGUAGES =
[('af', 'Afrikaans'),
 ('ar', 'Arabic'),
 ('ar-dz', 'Algerian Arabic'),
 ('ast', 'Asturian'),
 ('az', 'Azerbaijani'),
 ('bg', 'Bulgarian'),
 ('be', 'Belarusian'),
 ('bn', 'Bengali'),
 ('br', 'Breton'),
 ('bs', 'Bosnian'),
 ('ca', 'Catalan'),
 ('cs', 'Czech'),
 ('cy', 'Welsh'),
 ('da', 'Danish'),
 ('de', 'German'),
 ('dsb', 'Lower Sorbian'),
 ('el', 'Greek'),
 ('en', 'English'),
 ('en-au', 'Australian English'),
 ('en-gb', 'British English'),
 ('eo', 'Esperanto'),
 ('es', 'Spanish'),
 ('es-ar', 'Argentinian Spanish'),
 ('es-co', 'Colombian Spanish'),
 ('es-mx', 'Mexican Spanish'),
 ('es-ni', 'Nicaraguan Spanish'),
 ('es-ve', 'Venezuelan Spanish'),
 ('et', 'Estonian'),
 ('eu', 'Basque'),
 ('fa', 'Persian'),
 ('fi', 'Finnish'),
 ('fr', 'French'),
 ('fy', 'Frisian'),
 ('ga', 'Irish'),
 ('gd', 'Scottish Gaelic'),
 ('gl', 'Galician'),
 ('he', 'Hebrew'),
 ('hi', 'Hindi'),
 ('hr', 'Croatian'),
 ('hsb', 'Upper Sorbian'),
 ('hu', 'Hungarian'),
 ('hy', 'Armenian'),
 ('ia', 'Interlingua'),
 ('id', 'Indonesian'),
 ('ig', 'Igbo'),
 ('io', 'Ido'),
 ('is', 'Icelandic'),
 ('it', 'Italian'),
 ('ja', 'Japanese'),
 ('ka', 'Georgian'),
 ('kab', 'Kabyle'),
 ('kk', 'Kazakh'),
 ('km', 'Khmer'),
 ('kn', 'Kannada'),
 ('ko', 'Korean'),
 ('ky', 'Kyrgyz'),
 ('lb', 'Luxembourgish'),
 ('lt', 'Lithuanian'),
 ('lv', 'Latvian'),
 ('mk', 'Macedonian'),
 ('ml', 'Malayalam'),
 ('mn', 'Mongolian'),
 ('mr', 'Marathi'),
 ('my', 'Burmese'),
 ('nb', 'Norwegian Bokmål'),
 ('ne', 'Nepali'),
 ('nl', 'Dutch'),
 ('nn', 'Norwegian Nynorsk'),
 ('os', 'Ossetic'),
 ('pa', 'Punjabi'),
 ('pl', 'Polish'),
 ('pt', 'Portuguese'),
 ('pt-br', 'Brazilian Portuguese'),
 ('ro', 'Romanian'),
 ('ru', 'Russian'),
 ('sk', 'Slovak'),
 ('sl', 'Slovenian'),
 ('sq', 'Albanian'),
 ('sr', 'Serbian'),
 ('sr-latn', 'Serbian Latin'),
 ('sv', 'Swedish'),
 ('sw', 'Swahili'),
 ('ta', 'Tamil'),
 ('te', 'Telugu'),
 ('tg', 'Tajik'),
 ('th', 'Thai'),
 ('tk', 'Turkmen'),
 ('tr', 'Turkish'),
 ('tt', 'Tatar'),
 ('udm', 'Udmurt'),
 ('uk', 'Ukrainian'),
 ('ur', 'Urdu'),
 ('uz', 'Uzbek'),
 ('vi', 'Vietnamese'),
 ('zh-hans', 'Simplified Chinese'),
 ('zh-hant', 'Traditional Chinese')]
LANGUAGES_BIDI = ['he', 'ar', 'ar-dz', 'fa', 'ur']
LANGUAGE_CODE = 'en-us'
LANGUAGE_COOKIE_AGE = None
LANGUAGE_COOKIE_DOMAIN = None
LANGUAGE_COOKIE_HTTPONLY = False
LANGUAGE_COOKIE_NAME = 'django_language'
LANGUAGE_COOKIE_PATH = '/'
LANGUAGE_COOKIE_SAMESITE = None
LANGUAGE_COOKIE_SECURE = False
LOCALE_PATHS = []
LOGGING = {}
LOGGING_CONFIG = 'logging.config.dictConfig'
LOGIN_REDIRECT_URL = '/accounts/profile/'
LOGIN_URL = '/accounts/login/'
LOGOUT_REDIRECT_URL = None
MANAGERS = []
MEDIA_ROOT = ''
MEDIA_URL = '/'
MESSAGE_STORAGE = 'django.contrib.messages.storage.fallback.FallbackStorage'
MIDDLEWARE =
['django.middleware.security.SecurityMiddleware',
 'django.contrib.sessions.middleware.SessionMiddleware',
 'django.middleware.common.CommonMiddleware',
 'django.middleware.csrf.CsrfViewMiddleware',
 'django.contrib.auth.middleware.AuthenticationMiddleware',
 'django.contrib.messages.middleware.MessageMiddleware',
 'django.middleware.clickjacking.XFrameOptionsMiddleware']
MIGRATION_MODULES = {}
MONTH_DAY_FORMAT = 'F j'
NUMBER_GROUPING = 0
PASSWORD_HASHERS = '********************'
PASSWORD_RESET_TIMEOUT = '********************'
PASSWORD_RESET_TIMEOUT_DAYS = '********************'
PREPEND_WWW = False
ROOT_URLCONF = 'web_project.urls'
SECRET_KEY = '********************'
SECURE_BROWSER_XSS_FILTER = False
SECURE_CONTENT_TYPE_NOSNIFF = True
SECURE_HSTS_INCLUDE_SUBDOMAINS = False
SECURE_HSTS_PRELOAD = False
SECURE_HSTS_SECONDS = 0
SECURE_PROXY_SSL_HEADER = None
SECURE_REDIRECT_EXEMPT = []
SECURE_REFERRER_POLICY = 'same-origin'
SECURE_SSL_HOST = None
SECURE_SSL_REDIRECT = False
SERVER_EMAIL = 'root@localhost'
SESSION_CACHE_ALIAS = 'default'
SESSION_COOKIE_AGE = 1209600
SESSION_COOKIE_DOMAIN = None
SESSION_COOKIE_HTTPONLY = True
SESSION_COOKIE_NAME = 'sessionid'
SESSION_COOKIE_PATH = '/'
SESSION_COOKIE_SAMESITE = 'Lax'
SESSION_COOKIE_SECURE = False
SESSION_ENGINE = 'django.contrib.sessions.backends.db'
SESSION_EXPIRE_AT_BROWSER_CLOSE = False
SESSION_FILE_PATH = None
SESSION_SAVE_EVERY_REQUEST = False
SESSION_SERIALIZER = 'django.contrib.sessions.serializers.JSONSerializer'
SETTINGS_MODULE = 'web_project.settings'
SHORT_DATETIME_FORMAT = 'm/d/Y P'
SHORT_DATE_FORMAT = 'm/d/Y'
SIGNING_BACKEND = 'django.core.signing.TimestampSigner'
SILENCED_SYSTEM_CHECKS = []
STATICFILES_DIRS = []
STATICFILES_FINDERS =
['django.contrib.staticfiles.finders.FileSystemFinder',
 'django.contrib.staticfiles.finders.AppDirectoriesFinder']
STATICFILES_STORAGE = 'django.contrib.staticfiles.storage.StaticFilesStorage'
STATIC_ROOT = None
STATIC_URL = '/static/'
TEMPLATES =
[{'APP_DIRS': True,
  'BACKEND': 'django.template.backends.django.DjangoTemplates',
  'DIRS': [],
  'OPTIONS': {'context_processors': ['django.template.context_processors.debug',
                                     'django.template.context_processors.request',
                                     'django.contrib.auth.context_processors.auth',
                                     'django.contrib.messages.context_processors.messages']}}]
TEST_NON_SERIALIZED_APPS = []
TEST_RUNNER = 'django.test.runner.DiscoverRunner'
THOUSAND_SEPARATOR = ','
TIME_FORMAT = 'P'
TIME_INPUT_FORMATS = ['%H:%M:%S', '%H:%M:%S.%f', '%H:%M']
TIME_ZONE = 'UTC'
USE_I18N = True
USE_L10N = True
USE_THOUSAND_SEPARATOR = False
USE_TZ = True
USE_X_FORWARDED_HOST = False
USE_X_FORWARDED_PORT = False
WSGI_APPLICATION = 'web_project.wsgi.application'
X_FRAME_OPTIONS = 'DENY'
YEAR_MONTH_FORMAT = 'F Y'

You're seeing this error because you have DEBUG = True in your Django settings file. Change that to False, and Django will display a standard page generated by the handler for this status code.
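The footer's advice maps onto the project's settings module (`web_project.settings`, per the dump above). A minimal sketch of the relevant changes; the host list and the `ops@example.com` address are placeholders, not values from this deployment:

```python
# web_project/settings.py -- sketch, adjust for your deployment.

# Turning DEBUG off stops Django from rendering this detailed debug page,
# which leaks settings and environment data, to visitors.
DEBUG = False

# With DEBUG = False, ALLOWED_HOSTS must name the hosts the site may serve.
# The '*' seen in the dump above disables Host-header validation and is
# unsafe in production; the entries below are placeholders.
ALLOWED_HOSTS = ["192.168.1.145", "localhost"]

# With DEBUG = False, an unhandled exception (like the CUDA OOM here)
# returns a plain 500 page instead; list ADMINS (and configure email or
# LOGGING) to still receive the traceback.
ADMINS = [("Ops", "ops@example.com")]
```

Note that hiding the traceback does not address the underlying `RuntimeError`; the GPU out-of-memory condition in the summarization view still needs to be fixed separately.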

