nep-cmp New Economics Papers
on Computational Economics
Issue of 2018‒07‒16
nine papers chosen by



  1. Financial Risk and Returns Prediction with Modular Networked Learning By Carlos Pedro Gonçalves
  2. Structural Labour Supply Models and Microsimulation By Aaberge, Rolf; Colombino, Ugo
  3. Non-linear Time Series and Artificial Neural Networks of Red Hat Volatility By José Igor Morlanes
  4. Machine Learning for Yield Curve Feature Extraction: Application to Illiquid Corporate Bonds (Preliminary Draft) By Greg Kirczenow; Ali Fathi; Matt Davison
  5. Sustainability of the pension system in Macedonia: A comprehensive analysis and reform proposal with MK-PENS - dynamic microsimulation model By Blagica Petreski; Pavle Gacov
  6. Rough but not so Tough: Fast Hybrid Schemes for Fractional Riccati Equations By Callegaro Giorgia; Grasselli Martino; Pagès Gilles
  7. A Multi-Criteria Financial and Energy Portfolio Analysis of Hedge Fund Strategies By Allen, D.E.; McAleer, M.J.; Singh, A.K.
  8. Long-time large deviations for the multi-asset Wishart stochastic volatility model and option pricing By Aurélien Alfonsi; David Krief; Peter Tankov
  9. Classifying occupations using web-based job advertisements: an application to STEM and creative occupations By Antonio Lima; Hasan Bakhshi

  1. By: Carlos Pedro Gonçalves
    Abstract: An artificial agent for financial risk and returns prediction is built with a modular cognitive system composed of interconnected recurrent neural networks, such that the agent learns to predict financial returns and the squared deviation around these predicted returns. These two expectations are used to build a volatility-sensitive interval prediction for financial returns, which is evaluated on three major financial indices and shown to achieve a success rate above 80% in interval prediction in both training and testing, calling into question the Efficient Market Hypothesis. The agent is introduced as an example of a class of artificially intelligent systems equipped with a Modular Networked Learning cognitive system, defined as an integrated networked system of machine learning modules, where each module constitutes a functional unit trained for a specific task that solves a subproblem of a complex main problem expressed as a network of linked subproblems. In the case of neural networks, these systems function as a form of "artificial brain", where each module is like a specialized brain region composed of a neural network with a specific architecture.
    Date: 2018–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1806.05876&r=cmp
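    Sketch: The interval construction described in the abstract can be illustrated in a few lines: one learning module supplies a predicted return, a second supplies a predicted squared deviation, and the two are combined into a volatility-sensitive interval. The function names and the half-width multiplier below are illustrative assumptions, not the paper's specification.

        import numpy as np

        def interval_prediction(pred_return, pred_sq_dev, k=1.0):
            """Volatility-sensitive interval around a predicted return.

            pred_return : predicted returns (output of one learning module)
            pred_sq_dev : predicted squared deviations (output of a second module)
            k           : half-width multiplier (illustrative assumption)
            """
            width = k * np.sqrt(np.maximum(pred_sq_dev, 0.0))  # predicted volatility proxy
            return pred_return - width, pred_return + width

        def interval_success_rate(actual, lower, upper):
            """Fraction of realised returns falling inside the predicted interval."""
            return np.mean((actual >= lower) & (actual <= upper))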
  2. By: Aaberge, Rolf (Statistics Norway); Colombino, Ugo (University of Turin)
    Abstract: The purpose of the paper is to provide a discussion of the various approaches for accounting for labour supply responses in microsimulation models. The paper focuses on two methodologies for modelling labour supply: the discrete choice model and the random utility – random opportunities model. It then describes how these models are used for policy simulation, in terms of producing and interpreting simulation outcomes, and outlines an extensive literature of policy analyses based on these approaches. Labour supply models are central not only for analysing behavioural labour supply responses but also for identifying optimal tax-benefit systems, given some of the challenges of the theoretical approach. Combining labour supply results with individual and social welfare functions enables the social evaluation of policy simulations. Combining welfare functions and labour supply functions, the paper discusses how to model socially optimal income taxation.
    Keywords: behavioural microsimulation, labour supply, discrete choice, tax reforms
    JEL: C50 D10 D31 H21 H24 H31 J20
    Date: 2018–05
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp11562&r=cmp
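    Sketch: As a point of reference for the discrete-choice approach surveyed above, a standard random-utility formulation (a textbook form, not necessarily the exact specification of the models discussed) lets each agent choose hours h from a finite set H, with conditional-logit choice probabilities

        P(h_i = h) = \frac{\exp\{ v(C(h), h; \theta) \}}{\sum_{h' \in H} \exp\{ v(C(h'), h'; \theta) \}},

    where C(h) is disposable income at hours h under the tax-benefit rules, v is the deterministic utility and \theta are preference parameters. Policy simulation then replaces C(.) with the reformed budget constraint and recomputes the probabilities, which is what generates the behavioural labour supply responses referred to in the abstract.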
  3. By: José Igor Morlanes
    Abstract: We extend the empirical results published in article "Empirical Evidence on Arbitrage by Changing the Stock Exchange" by means of machine learning and advanced econometric methodologies based on Smooth Transition Regression models and Artificial Neural Networks.
    Date: 2018–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1806.01070&r=cmp
  4. By: Greg Kirczenow; Ali Fathi; Matt Davison
    Abstract: This paper studies the application of machine learning in extracting market-implied features from historical risk-neutral corporate bond yields. We consider the example of a hypothetical illiquid fixed income market. After choosing a surrogate liquid market, we apply the Denoising Autoencoder algorithm from the field of computer vision and pattern recognition to learn the features of the missing yield parameters from the historically implied data of the instruments traded in the chosen liquid market. Finally, the results of the trained machine learning algorithm are compared with the outputs of a point-in-time two-dimensional interpolation algorithm known as the Thin Plate Spline, and the performance of the two approaches is assessed.
    Date: 2018–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1806.01731&r=cmp
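    Sketch: A minimal sketch of the denoising-autoencoder idea referred to above, assuming yields are arranged as fixed-length curves and corruption is done by zeroing random tenors; the layer sizes and masking scheme are illustrative assumptions, not the authors' configuration.

        import torch
        import torch.nn as nn

        class DenoisingAutoencoder(nn.Module):
            """Small denoising autoencoder for fixed-length yield curves."""
            def __init__(self, n_tenors=20, n_latent=4):
                super().__init__()
                self.encoder = nn.Sequential(nn.Linear(n_tenors, 16), nn.ReLU(),
                                             nn.Linear(16, n_latent))
                self.decoder = nn.Sequential(nn.Linear(n_latent, 16), nn.ReLU(),
                                             nn.Linear(16, n_tenors))

            def forward(self, x):
                return self.decoder(self.encoder(x))

        def train_step(model, optimiser, curves, drop_prob=0.3):
            """Corrupt the input curves by zeroing random tenors, then train the
            network to reconstruct the full, uncorrupted curves."""
            mask = (torch.rand_like(curves) > drop_prob).float()
            loss = nn.functional.mse_loss(model(curves * mask), curves)
            optimiser.zero_grad()
            loss.backward()
            optimiser.step()
            return loss.item()

    In this sketch, the reconstructions at the zeroed tenors stand in for the missing yields, mirroring the role the point-in-time interpolation benchmark plays in the abstract's comparison.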
  5. By: Blagica Petreski; Pavle Gacov
    Date: 2018–03
    URL: http://d.repec.org/n?u=RePEc:ftm:policy:2018-03/14&r=cmp
  6. By: Callegaro Giorgia; Grasselli Martino; Pagès Gilles
    Abstract: We solve a family of fractional Riccati differential equations with constant (possibly complex) coefficients. These equations arise, e.g., in fractional Heston stochastic volatility models when computing the characteristic function of the log-spot price, hence from a pricing perspective. We first consider the case of a zero initial value, corresponding to the characteristic function of the log-price. Then we investigate the case of a general starting value, associated with a transform also involving the volatility process. The solution to the fractional Riccati equation takes the form of a power series, whose coefficients satisfy a convolution equation. We show that this solution has a positive convergence domain, which is typically finite. Our theoretical results naturally suggest a numerical algorithm to explicitly obtain the solution to quadratic ordinary differential equations of fractional type, which turns out to be quite promising in terms of computational performance. In particular, we introduce a hybrid numerical scheme, test its precision by comparing our results with the (few) benchmarks available in the literature, based on the Adams method, and numerically study the convergence of our procedure.
    Date: 2018–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1805.12587&r=cmp
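    Sketch: For orientation, assume the scalar fractional Riccati equation takes the form D^{\alpha}\psi(t) = \lambda + \mu\,\psi(t) + \nu\,\psi(t)^{2} with \psi(0) = 0 and constant (possibly complex) coefficients; the following is a sketch of the standard power-series construction, not necessarily the paper's exact notation. Using D^{\alpha} t^{\beta} = \frac{\Gamma(\beta+1)}{\Gamma(\beta+1-\alpha)}\, t^{\beta-\alpha}, the ansatz and the recursion for its coefficients read

        \psi(t) = \sum_{k \ge 1} a_k\, t^{k\alpha}, \qquad
        a_1 = \frac{\lambda}{\Gamma(\alpha+1)}, \qquad
        a_k = \frac{\Gamma((k-1)\alpha+1)}{\Gamma(k\alpha+1)}
              \Big( \mu\, a_{k-1} + \nu \sum_{i=1}^{k-2} a_i\, a_{k-1-i} \Big), \quad k \ge 2,

    so the quadratic term enters through a discrete convolution of the coefficients, which is the convolution equation mentioned in the abstract.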
  7. By: Allen, D.E.; McAleer, M.J.; Singh, A.K.
    Abstract: The paper undertakes a multi-criteria portfolio analysis of hedge fund strategies concerned with financial commodities, including the possibility of positions in energy spot, futures and exchange traded funds (ETFs). It features a tri-criteria analysis of Eurekahedge strategy index data, using nine Eurekahedge equally weighted main strategy indices for the portfolio analysis. The tri-criteria analysis features three objectives: return, risk and dispersion of risk, in a Multi-Criteria Optimisation (MCO) portfolio analysis. We vary the MCO return and risk targets and contrast the results with four more standard portfolio optimisation criteria, namely the tangency portfolio (MSR), the most diversified portfolio (MDP), the global minimum variance portfolio (GMW), and portfolios based on minimising expected shortfall (ERC). Backtests of the chosen portfolios for this hedge fund data set indicate that the use of MCO is accompanied by uncertainty about the a priori choice of optimal parameter settings for the decision criteria. The empirical results do not appear to outperform more standard bi-criteria portfolio analyses in the backtests undertaken on the hedge fund index data.
    Keywords: MCO, Portfolio Analysis, Hedge Fund Strategies, Multi-Criteria Optimisation, Genetic Algorithms, Spot prices, Futures prices, Exchange Traded Funds (ETF)
    JEL: G15 G17 G32 C58 D53
    Date: 2018–06–11
    URL: http://d.repec.org/n?u=RePEc:ems:eureir:109055&r=cmp
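    Sketch: The three decision criteria can be evaluated for a candidate weight vector as below; the dispersion measure (the standard deviation of percentage risk contributions) is an illustrative choice, not necessarily the one used in the paper. A multi-objective search, for instance a genetic algorithm as suggested by the keywords, would then explore the weight simplex for trade-offs between the three values.

        import numpy as np

        def tri_criteria(weights, mean_returns, cov):
            """Return the three portfolio criteria for a candidate weight vector:
            expected return, variance (risk) and dispersion of risk contributions."""
            w = np.asarray(weights, dtype=float)
            port_return = w @ mean_returns
            port_var = w @ cov @ w
            risk_contrib = w * (cov @ w) / port_var   # percentage risk contributions
            return port_return, port_var, np.std(risk_contrib)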
  8. By: Aurélien Alfonsi; David Krief; Peter Tankov
    Abstract: We prove a large deviations principle for the class of multidimensional affine stochastic volatility models considered in (Gourieroux, C. and Sufana, R., J. Bus. Econ. Stat., 28(3), 2010), where the volatility matrix is modelled by a Wishart process. This class extends the very popular Heston model to the multivariate setting, thus allowing us to model the joint behaviour of a basket of stocks or several interest rates. We then use the large deviations principle to obtain an asymptotic approximation for the implied volatility of basket options and to develop an asymptotically optimal importance sampling algorithm, which reduces the number of simulations needed when pricing derivatives by Monte Carlo methods.
    Date: 2018–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1806.06883&r=cmp
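    Sketch: The variance-reduction idea behind the importance sampling step can be shown on a one-dimensional Black-Scholes toy example rather than the Wishart model: sample the driving Gaussian under a shifted mean and reweight by the likelihood ratio. The drift shift theta below is ad hoc, not the asymptotically optimal one derived in the paper.

        import numpy as np

        def bs_call_mc(s0, strike, r, sigma, t, theta=0.0, n=100_000, seed=0):
            """Monte-Carlo price of a European call under Black-Scholes with an
            optional importance-sampling shift `theta` on the driving Gaussian.
            theta = 0 recovers plain Monte Carlo; a positive theta pushes paths
            towards an out-of-the-money strike."""
            rng = np.random.default_rng(seed)
            z = rng.standard_normal(n) + theta                 # sample under N(theta, 1)
            weight = np.exp(-theta * z + 0.5 * theta ** 2)     # Radon-Nikodym weight back to N(0, 1)
            s_t = s0 * np.exp((r - 0.5 * sigma ** 2) * t + sigma * np.sqrt(t) * z)
            payoff = np.exp(-r * t) * np.maximum(s_t - strike, 0.0) * weight
            return payoff.mean(), payoff.std(ddof=1) / np.sqrt(n)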
  9. By: Antonio Lima; Hasan Bakhshi
    Abstract: Rapid technological, social and economic change is having significant impacts on the nature of jobs. In fast-changing environments it is crucial that policymakers have a clear and timely picture of the labour market. Policymakers use standardised occupational classifications, such as the Office for National Statistics’ Standard Occupational Classification (SOC) in the UK, to analyse the labour market. These permit the occupational composition of the workforce to be tracked on a consistent and transparent basis over time and across industrial sectors. However, such systems are by their nature costly to maintain, slow to adapt and not very flexible. For that reason, additional tools are needed. At the same time, policymakers around the world are revisiting how active skills development policies can be used to equip workers with the capabilities needed to meet the new labour market realities. There is, in parallel, a desire for a more granular understanding of what skill combinations are required of occupations, in part so that policymakers are better informed about how individuals can redeploy these skills as and when employer demands change further. In this paper, we investigate the possibility of complementing traditional occupational classifications with more flexible methods centred around employers’ characterisations of the skills and knowledge requirements of occupations as presented in job advertisements. We use data science methods to classify job advertisements as STEM or non-STEM (Science, Technology, Engineering and Mathematics) and creative or non-creative, based on the content of ads in a database of UK job ads posted online, belonging to the Boston-based job market analytics company Burning Glass Technologies. In doing so, we first characterise each SOC code in terms of its skill make-up; this step allows us to describe each SOC skillset as a mathematical object that can be compared with other skillsets. Then we develop a classifier that predicts the SOC code of a job based on its required skills. Finally, we develop two classifiers that decide whether a job vacancy is STEM/non-STEM and creative/non-creative, based again on its skill requirements.
    Keywords: labour demand, occupational classification, online job adverts, big data, machine learning, STEM, STEAM, creative economy
    JEL: C18 J23 J24
    Date: 2018–07
    URL: http://d.repec.org/n?u=RePEc:nsr:escoed:escoe-dp-2018-07&r=cmp
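    Sketch: The final classification step can be illustrated with a generic bag-of-words pipeline over the skill text of an advert; the features, model and toy adverts below are illustrative assumptions, not the authors' skill-based pipeline built on SOC profiles.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Toy adverts described by their skill requirements, with made-up labels.
        ads = [
            "python machine learning statistics data analysis",
            "mechanical engineering CAD thermodynamics testing",
            "copywriting social media campaign planning",
            "customer service till operation stock replenishment",
        ]
        labels = ["STEM", "STEM", "non-STEM", "non-STEM"]

        # Bag-of-words features plus a linear classifier.
        model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
        model.fit(ads, labels)
        print(model.predict(["statistical modelling in R and SQL"]))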

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.