Arnulf Jentzen
2020 – today
- 2024
- [j32] Sonja Cox, Arnulf Jentzen, Felix Lindner: Weak convergence rates for temporal numerical approximations of the semilinear stochastic wave equation with multiplicative noise. Numerische Mathematik 156(6): 2131-2177 (2024)
- [i65] Arnulf Jentzen, Adrian Riekert: Non-convergence to global minimizers for Adam and stochastic gradient descent optimization and constructions of local minimizers in the training of artificial neural networks. CoRR abs/2402.05155 (2024)
- [i64] Julia Ackermann, Arnulf Jentzen, Benno Kuckuck, Joshua Lee Padgett: Deep neural networks with ReLU, leaky ReLU, and softplus activation provably overcome the curse of dimensionality for space-time solutions of semilinear partial differential equations. CoRR abs/2406.10876 (2024)
- [i63] Steffen Dereich, Arnulf Jentzen, Adrian Riekert: Learning rate adaptive stochastic gradient descent optimization methods: numerical simulations for deep learning methods for partial differential equations and convergence analyses. CoRR abs/2406.14340 (2024)
- [i62] Steffen Dereich, Robin Graeber, Arnulf Jentzen: Non-convergence of Adam and other adaptive stochastic gradient descent optimization methods for non-vanishing learning rates. CoRR abs/2407.08100 (2024)
- [i61] Steffen Dereich, Arnulf Jentzen: Convergence rates for the Adam optimizer. CoRR abs/2407.21078 (2024)
- [i60] Lukas Gonon, Arnulf Jentzen, Benno Kuckuck, Siyu Liang, Adrian Riekert, Philippe von Wurstemberger: An Overview on Machine Learning Methods for Partial Differential Equations: from Physics Informed Neural Networks to Deep Operator Learning. CoRR abs/2408.13222 (2024)
- [i59] Sonja Hannibal, Arnulf Jentzen, Do Minh Thang: Non-convergence to global minimizers in data driven supervised deep learning: Adam and stochastic gradient descent optimization provably fail to converge to global minimizers in the training of deep neural networks with ReLU activation. CoRR abs/2410.10533 (2024)
- 2023
- [j31] Philipp Grohs, Fabian Hornung, Arnulf Jentzen, Philipp Zimmermann: Space-time error estimates for deep neural network approximations for differential equations. Adv. Comput. Math. 49(1): 4 (2023)
- [j30] Arnulf Jentzen, Timo Welti: Overall error analysis for the training of deep neural networks via stochastic gradient descent with random initialisation. Appl. Math. Comput. 455: 127907 (2023)
- [j29] Christian Beck, Sebastian Becker, Patrick Cheridito, Arnulf Jentzen, Ariel Neufeld: An efficient Monte Carlo scheme for Zakai equations. Commun. Nonlinear Sci. Numer. Simul. 126: 107438 (2023)
- [j28] Philipp Grohs, Shokhrukh Ibragimov, Arnulf Jentzen, Sarah Koppensteiner: Lower bounds for artificial neural network approximations: A proof that shallow neural networks fail to overcome the curse of dimensionality. J. Complex. 77: 101746 (2023)
- [j27] Martin Hutzenthaler, Arnulf Jentzen, Thomas Kruse, Tuan Anh Nguyen: Overcoming the curse of dimensionality in the numerical approximation of backward stochastic differential equations. J. Num. Math. 31(1): 1-28 (2023)
- [i58] Lukas Gonon, Robin Graeber, Arnulf Jentzen: The necessity of depth for artificial neural networks to approximate certain classes of smooth and bounded functions without the curse of dimensionality. CoRR abs/2301.08284 (2023)
- [i57] Arnulf Jentzen, Adrian Riekert, Philippe von Wurstemberger: Algorithmically Designed Artificial Neural Networks (ADANNs): Higher order deep operator learning for parametric partial differential equations. CoRR abs/2302.03286 (2023)
- [i56] Steffen Dereich, Arnulf Jentzen, Sebastian Kassing: On the existence of minimizers in shallow residual ReLU neural network optimization landscapes. CoRR abs/2302.14690 (2023)
- [i55] Julia Ackermann, Arnulf Jentzen, Thomas Kruse, Benno Kuckuck, Joshua Lee Padgett: Deep neural networks with ReLU, leaky ReLU, and softplus activation provably overcome the curse of dimensionality for Kolmogorov partial differential equations with Lipschitz nonlinearities in the Lp-sense. CoRR abs/2309.13722 (2023)
- [i54] Arnulf Jentzen, Benno Kuckuck, Philippe von Wurstemberger: Mathematical Introduction to Deep Learning: Methods, Implementations, and Theory. CoRR abs/2310.20360 (2023)
- 2022
- [j26] Martin Hutzenthaler, Arnulf Jentzen, Thomas Kruse: Overcoming the Curse of Dimensionality in the Numerical Approximation of Parabolic Partial Differential Equations with Gradient-Dependent Nonlinearities. Found. Comput. Math. 22(4): 905-966 (2022)
- [j25] Patrick Cheridito, Arnulf Jentzen, Adrian Riekert, Florian Rossmannek: A proof of convergence for gradient descent in the training of artificial neural networks for constant target functions. J. Complex. 72: 101646 (2022)
- [j24] Arnulf Jentzen, Adrian Riekert: A proof of convergence for the gradient descent optimization method with random initializations in the training of neural networks with ReLU activation for piecewise linear target functions. J. Mach. Learn. Res. 23: 260:1-260:50 (2022)
- [j23] Patrick Cheridito, Arnulf Jentzen, Florian Rossmannek: Landscape Analysis for Shallow Neural Networks: Complete Classification of Critical Points for Affine Target Functions. J. Nonlinear Sci. 32(5): 64 (2022)
- [j22] Patrick Cheridito, Arnulf Jentzen, Florian Rossmannek: Efficient Approximation of High-Dimensional Functions With Neural Networks. IEEE Trans. Neural Networks Learn. Syst. 33(7): 3079-3093 (2022)
- [i53] Sebastian Becker, Arnulf Jentzen, Marvin S. Müller, Philippe von Wurstemberger: Learning the random variables in Monte Carlo simulations with stochastic gradient descent: Machine learning for parametric PDEs and financial derivative pricing. CoRR abs/2202.02717 (2022)
- [i52] Victor Boussange, Sebastian Becker, Arnulf Jentzen, Benno Kuckuck, Loïc Pellissier: Deep learning approximations for non-local nonlinear PDEs with Neumann boundary conditions. CoRR abs/2205.03672 (2022)
- [i51] Arnulf Jentzen, Timo Kröger: On bounds for norms of reparameterized ReLU artificial neural network parameters: sums of fractional powers of the Lipschitz norm control the network parameter vector. CoRR abs/2206.13646 (2022)
- [i50] Simon Eberle, Arnulf Jentzen, Adrian Riekert, Georg S. Weiss: Normalized gradient flow optimization in the training of ReLU artificial neural networks. CoRR abs/2207.06246 (2022)
- [i49] Patrick Cheridito, Arnulf Jentzen, Florian Rossmannek: Gradient descent provably escapes saddle points in the training of shallow ReLU networks. CoRR abs/2208.02083 (2022)
- [i48] Christian Beck, Sebastian Becker, Patrick Cheridito, Arnulf Jentzen, Ariel Neufeld: An efficient Monte Carlo scheme for Zakai equations. CoRR abs/2210.13530 (2022)
- 2021
- [j21] Arnulf Jentzen, Ryan Kurniawan: Weak Convergence Rates for Euler-Type Approximations of Semilinear Stochastic Evolution Equations with Nonlinear Diffusion Coefficients. Found. Comput. Math. 21(2): 445-536 (2021)
- [j20] Patrick Cheridito, Arnulf Jentzen, Florian Rossmannek: Non-convergence of stochastic gradient descent in the training of deep neural networks. J. Complex. 64: 101540 (2021)
- [j19] Christian Beck, Sebastian Becker, Philipp Grohs, Nor Jaafari, Arnulf Jentzen: Solving the Kolmogorov PDE by Means of Deep Learning. J. Sci. Comput. 88(3): 73 (2021)
- [j18] Christian Beck, Sebastian Becker, Patrick Cheridito, Arnulf Jentzen, Ariel Neufeld: Deep Splitting Method for Parabolic PDEs. SIAM J. Sci. Comput. 43(5): A3135-A3154 (2021)
- [i47] Patrick Cheridito, Arnulf Jentzen, Adrian Riekert, Florian Rossmannek: A proof of convergence for gradient descent in the training of artificial neural networks for constant target functions. CoRR abs/2102.09924 (2021)
- [i46] Arnulf Jentzen, Timo Kröger: Convergence rates for gradient descent in the training of overparameterized artificial neural networks with biases. CoRR abs/2102.11840 (2021)
- [i45] Christian Beck, Martin Hutzenthaler, Arnulf Jentzen, Emilia Magnani: Full history recursive multilevel Picard approximations for ordinary differential equations with expectations. CoRR abs/2103.02350 (2021)
- [i44] Philipp Grohs, Shokhrukh Ibragimov, Arnulf Jentzen, Sarah Koppensteiner: Lower bounds for artificial neural network approximations: A proof that shallow neural networks fail to overcome the curse of dimensionality. CoRR abs/2103.04488 (2021)
- [i43] Patrick Cheridito, Arnulf Jentzen, Florian Rossmannek: Landscape analysis for shallow ReLU neural networks: complete classification of critical points for affine target functions. CoRR abs/2103.10922 (2021)
- [i42] Arnulf Jentzen, Adrian Riekert: A proof of convergence for stochastic gradient descent in the training of artificial neural networks with ReLU activation for constant target functions. CoRR abs/2104.00277 (2021)
- [i41] Arnulf Jentzen, Adrian Riekert: Convergence analysis for gradient flows in the training of artificial neural networks with ReLU activation. CoRR abs/2107.04479 (2021)
- [i40] Arnulf Jentzen, Adrian Riekert: A proof of convergence for the gradient descent optimization method with random initializations in the training of neural networks with ReLU activation for piecewise linear target functions. CoRR abs/2108.04620 (2021)
- [i39] Simon Eberle, Arnulf Jentzen, Adrian Riekert, Georg S. Weiss: Existence, uniqueness, and convergence rates for gradient flows in the training of artificial neural networks with ReLU activation. CoRR abs/2108.08106 (2021)
- [i38] Martin Hutzenthaler, Arnulf Jentzen, Thomas Kruse, Tuan Anh Nguyen: Overcoming the curse of dimensionality in the numerical approximation of backward stochastic differential equations. CoRR abs/2108.10602 (2021)
- [i37] Martin Hutzenthaler, Arnulf Jentzen, Benno Kuckuck, Joshua Lee Padgett: Strong Lp-error analysis of nonlinear Monte Carlo approximations for high-dimensional semilinear partial differential equations. CoRR abs/2110.08297 (2021)
- [i36] Martin Hutzenthaler, Arnulf Jentzen, Katharina Pohl, Adrian Riekert, Luca Scarpa: Convergence proof for stochastic gradient descent in the training of deep neural networks with ReLU activation for constant target functions. CoRR abs/2112.07369 (2021)
- [i35] Arnulf Jentzen, Adrian Riekert: On the existence of global minima and convergence analyses for gradient descent methods in the training of deep neural networks. CoRR abs/2112.09684 (2021)
- [i34] Pierfrancesco Beneventano, Patrick Cheridito, Robin Graeber, Arnulf Jentzen, Benno Kuckuck: Deep neural network approximation theory for high-dimensional functions. CoRR abs/2112.14523 (2021)
- 2020
- [j17] Arnulf Jentzen, Philippe von Wurstemberger: Lower error bounds for the stochastic gradient descent optimization algorithm: Sharp convergence rates for slowly and fast decaying learning rates. J. Complex. 57: 101438 (2020)
- [j16] Benjamin J. Fehrman, Benjamin Gess, Arnulf Jentzen: Convergence Rates for the Stochastic Gradient Descent Method for Non-Convex Objective Functions. J. Mach. Learn. Res. 21: 136:1-136:48 (2020)
- [j15] Christian Beck, Fabian Hornung, Martin Hutzenthaler, Arnulf Jentzen, Thomas Kruse: Overcoming the curse of dimensionality in the numerical approximation of Allen-Cahn partial differential equations via truncated full-history recursive multilevel Picard approximations. J. Num. Math. 28(4): 197-222 (2020)
- [j14] Arnulf Jentzen, Felix Lindner, Primoz Pusnik: Exponential moment bounds and strong convergence rates for tamed-truncated numerical approximations of stochastic convolutions. Numer. Algorithms 85(4): 1447-1473 (2020)
- [j13] Julius Berner, Philipp Grohs, Arnulf Jentzen: Analysis of the Generalization Error: Empirical Risk Minimization over Deep Artificial Neural Networks Overcomes the Curse of Dimensionality in the Numerical Approximation of Black-Scholes Partial Differential Equations. SIAM J. Math. Data Sci. 2(3): 631-657 (2020)
- [i33] Christian Beck, Lukas Gonon, Arnulf Jentzen: Overcoming the curse of dimensionality in the numerical approximation of high-dimensional semilinear elliptic partial differential equations. CoRR abs/2003.00596 (2020)
- [i32] Arnulf Jentzen, Timo Welti: Overall error analysis for the training of deep neural networks via stochastic gradient descent with random initialisation. CoRR abs/2003.01291 (2020)
- [i31] Sebastian Becker, Ramon Braunwarth, Martin Hutzenthaler, Arnulf Jentzen, Philippe von Wurstemberger: Numerical simulations for full history recursive multilevel Picard approximations for systems of high-dimensional partial differential equations. CoRR abs/2005.10206 (2020)
- [i30] Fabian Hornung, Arnulf Jentzen, Diyora Salimova: Space-time deep neural network approximations for high-dimensional partial differential equations. CoRR abs/2006.02199 (2020)
- [i29] Patrick Cheridito, Arnulf Jentzen, Florian Rossmannek: Non-convergence of stochastic gradient descent in the training of deep neural networks. CoRR abs/2006.07075 (2020)
- [i28] Aritz Bercher, Lukas Gonon, Arnulf Jentzen, Diyora Salimova: Weak error analysis for stochastic gradient descent optimization algorithms. CoRR abs/2007.02723 (2020)
- [i27] Weinan E, Jiequn Han, Arnulf Jentzen: Algorithms for Solving High Dimensional PDEs: From Nonlinear Monte Carlo to Machine Learning. CoRR abs/2008.13333 (2020)
- [i26] Martin Hutzenthaler, Arnulf Jentzen, Thomas Kruse, Tuan Anh Nguyen: Multilevel Picard approximations for high-dimensional semilinear second-order PDEs with Lipschitz nonlinearities. CoRR abs/2009.02484 (2020)
- [i25] Christian Beck, Arnulf Jentzen, Thomas Kruse: Nonlinear Monte Carlo methods with polynomial runtime for high-dimensional iterated nested expectations. CoRR abs/2009.13989 (2020)
- [i24] Christian Beck, Sebastian Becker, Patrick Cheridito, Arnulf Jentzen, Ariel Neufeld: Deep learning based numerical approximation algorithms for stochastic partial differential equations and high-dimensional nonlinear filtering problems. CoRR abs/2012.01194 (2020)
- [i23] Pierfrancesco Beneventano, Patrick Cheridito, Arnulf Jentzen, Philippe von Wurstemberger: High-dimensional approximation spaces of artificial neural networks and applications to partial differential equations. CoRR abs/2012.04326 (2020)
- [i22] Arnulf Jentzen, Adrian Riekert: Strong overall error analysis for the training of artificial neural networks via random initializations. CoRR abs/2012.08443 (2020)
- [i21] Christian Beck, Martin Hutzenthaler, Arnulf Jentzen, Benno Kuckuck: An overview on deep learning-based approximation methods for partial differential equations. CoRR abs/2012.12348 (2020)
2010 – 2019
- 2019
- [j12] Mario Hefter, Arnulf Jentzen: On arbitrarily slow convergence rates for strong numerical approximations of Cox-Ingersoll-Ross processes and squared Bessel processes. Finance Stochastics 23(1): 139-172 (2019)
- [j11] Sebastian Becker, Patrick Cheridito, Arnulf Jentzen: Deep Optimal Stopping. J. Mach. Learn. Res. 20: 74:1-74:25 (2019)
- [j10] Christian Beck, Weinan E, Arnulf Jentzen: Machine Learning Approximation Algorithms for High-Dimensional Fully Nonlinear Partial Differential Equations and Second-order Backward Stochastic Differential Equations. J. Nonlinear Sci. 29(4): 1563-1619 (2019)
- [j9] Weinan E, Martin Hutzenthaler, Arnulf Jentzen, Thomas Kruse: On Multilevel Picard Numerical Approximations for High-Dimensional Nonlinear Parabolic Partial Differential Equations and High-Dimensional Nonlinear Backward Stochastic Differential Equations. J. Sci. Comput. 79(3): 1534-1571 (2019)
- [i20] Benjamin J. Fehrman, Benjamin Gess, Arnulf Jentzen: Convergence rates for the stochastic gradient descent method for non-convex objective functions. CoRR abs/1904.01517 (2019)
- [i19] Julius Berner, Dennis Elbrächter, Philipp Grohs, Arnulf Jentzen: Towards a regularity theory for ReLU networks - chain rule and global error estimates. CoRR abs/1905.04992 (2019)
- [i18] Christian Beck, Sebastian Becker, Patrick Cheridito, Arnulf Jentzen, Ariel Neufeld: Deep splitting method for parabolic PDEs. CoRR abs/1907.03452 (2019)
- [i17] Christian Beck, Fabian Hornung, Martin Hutzenthaler, Arnulf Jentzen, Thomas Kruse: Overcoming the curse of dimensionality in the numerical approximation of Allen-Cahn partial differential equations via truncated full-history recursive multilevel Picard approximations. CoRR abs/1907.06729 (2019)
- [i16] Sebastian Becker, Patrick Cheridito, Arnulf Jentzen, Timo Welti: Solving high-dimensional optimal stopping problems using deep learning. CoRR abs/1908.01602 (2019)
- [i15] Philipp Grohs, Fabian Hornung, Arnulf Jentzen, Philipp Zimmermann: Space-time error estimates for deep neural network approximations for differential equations. CoRR abs/1908.03833 (2019)
- [i14] Philipp Grohs, Arnulf Jentzen, Diyora Salimova: Deep neural network approximations for Monte Carlo algorithms. CoRR abs/1908.10828 (2019)
- [i13] Christian Beck, Arnulf Jentzen, Benno Kuckuck: Full error analysis for the training of deep neural networks. CoRR abs/1910.00121 (2019)
- [i12] Martin Hutzenthaler, Arnulf Jentzen, Felix Lindner, Primoz Pusnik: Strong convergence rates on the whole probability space for space-time discrete numerical approximation schemes for stochastic Burgers equations. CoRR abs/1911.01870 (2019)
- [i11] Michael B. Giles, Arnulf Jentzen, Timo Welti: Generalised multilevel Picard approximations. CoRR abs/1911.03188 (2019)
- [i10] Lukas Gonon, Philipp Grohs, Arnulf Jentzen, David Kofler, David Siska: Uniform error estimates for artificial neural network approximations for heat equations. CoRR abs/1911.09647 (2019)
- [i9] Martin Hutzenthaler, Arnulf Jentzen, Thomas Kruse: Overcoming the curse of dimensionality in the numerical approximation of parabolic partial differential equations with gradient-dependent nonlinearities. CoRR abs/1912.02571 (2019)
- [i8] Patrick Cheridito, Arnulf Jentzen, Florian Rossmannek: Efficient approximation of high-dimensional functions with deep neural networks. CoRR abs/1912.04310 (2019)
- 2018
- [j8] Martin Hutzenthaler, Arnulf Jentzen, Xiaojie Wang: Exponential integrability properties of numerical approximation processes for nonlinear stochastic differential equations. Math. Comput. 87(311): 1353-1413 (2018)
- [i7] Christian Beck, Sebastian Becker, Philipp Grohs, Nor Jaafari, Arnulf Jentzen: Solving stochastic differential equations and Kolmogorov equations by means of deep learning. CoRR abs/1806.00421 (2018)
- [i6] Philipp Grohs, Fabian Hornung, Arnulf Jentzen, Philippe von Wurstemberger: A proof that artificial neural networks overcome the curse of dimensionality in the numerical approximation of Black-Scholes partial differential equations. CoRR abs/1809.02362 (2018)
- [i5] Julius Berner, Philipp Grohs, Arnulf Jentzen: Analysis of the generalization error: Empirical risk minimization over deep artificial neural networks overcomes the curse of dimensionality in the numerical approximation of Black-Scholes partial differential equations. CoRR abs/1809.03062 (2018)
- [i4] Arnulf Jentzen, Diyora Salimova, Timo Welti: A proof that deep artificial neural networks overcome the curse of dimensionality in the numerical approximation of Kolmogorov partial differential equations with constant diffusion and nonlinear drift coefficients. CoRR abs/1809.07321 (2018)
- 2017
- [i3] Weinan E, Jiequn Han, Arnulf Jentzen: Deep learning-based numerical methods for high-dimensional parabolic partial differential equations and backward stochastic differential equations. CoRR abs/1706.04702 (2017)
- [i2] Jiequn Han, Arnulf Jentzen, Weinan E: Overcoming the curse of dimensionality: Solving high-dimensional partial differential equations using deep learning. CoRR abs/1707.02568 (2017)
- [i1] Christian Beck, Weinan E, Arnulf Jentzen: Machine learning approximation algorithms for high-dimensional fully nonlinear partial differential equations and second-order backward stochastic differential equations. CoRR abs/1709.05963 (2017)
- 2016
- [j7] Sebastian Becker, Arnulf Jentzen, Peter E. Kloeden: An Exponential Wagner-Platen Type Scheme for SPDEs. SIAM J. Numer. Anal. 54(4): 2389-2426 (2016)
- 2015
- [j6] Arnulf Jentzen, Michael Röckner: A Milstein Scheme for SPDEs. Found. Comput. Math. 15(2): 313-362 (2015)
- 2013
- [j5] Dirk Blömker, Arnulf Jentzen: Galerkin Approximations for the Stochastic Burgers Equation. SIAM J. Numer. Anal. 51(1): 694-715 (2013)
- 2011
- [j4] Martin Hutzenthaler, Arnulf Jentzen: Convergence of the Stochastic Euler Scheme for Locally Lipschitz Coefficients. Found. Comput. Math. 11(6): 657-706 (2011)
- [j3] Arnulf Jentzen: Higher Order Pathwise Numerical Approximations of SPDEs with Additive Noise. SIAM J. Numer. Anal. 49(2): 642-667 (2011)
- 2010
- [j2] Arnulf Jentzen, Frank Leber, Daniela Schneisgen, Arno Berger, Stefan Siegmund: An Improved Maximum Allowable Transfer Interval for Lp-Stability of Networked Control Systems. IEEE Trans. Autom. Control. 55(1): 179-184 (2010)
2000 – 2009
- 2009
- [j1] Arnulf Jentzen, Peter E. Kloeden, Andreas Neuenkirch: Pathwise approximation of stochastic differential equations on domains: higher order convergence rates without global Lipschitz coefficients. Numerische Mathematik 112(1): 41-64 (2009)