↓

“The Other in game theory…is the fictive ideal of one who makes infallible calculations. He is a pure subject of the signifier, in the sense of doing nothing other than fitting in with this signifier, in contrast to the subject of the unconscious, who, being qua drive still bound to objects and in this way always referring at least to a remainder of the real, is always only an ‘impure’ subject of the signifier. A pure subject of the signifier perhaps only exists in a purely theoretically defined moment—that of the infinitization of the subject, when the ur-signifier is pure non-sense and does not let any unconscious meaning arise in the subject, but completely abolishes this meaning (Sé XI 227/252). Yet this ‘case’ does not seem to be so eccentric, since Lacan immediately afterward formulates the project of achieving, through formalization, “the mediation of this infinity of the subject with the finiteness of desire” (Sé XI 228/252). So Lacan too is concerned with the relation between an infinite and a finite, even if in a very different way from Levinas. Here we are dealing, after all, with the infinity of a subject—which, however, can only come into existence in the first place because the subject is called on to realize himself in the field of the Other.”

↓

“Hal Stern once told me that the real dividing line in statistics is not Bayesian vs. frequentist but rather model-based vs. non-model-based. Another way I’ve heard it is, generative vs. non-generative models. […] [One] way to distinguish the two approaches to statistics is not in terms of what they do but rather what they’d like to do. Consider two inferential strategies:
1. Set up a strong model with many assumptions. The ultimate goal of the statistical analysis might be to reject the model and replace it with something better. In the terminology of Kuhn (1969), Bayesian inference conditional on the model is ‘normal science’, and rejection through posterior predictive checking is the stuff of ‘scientific revolutions’. The key here is: the stronger the model, the more directly it can be falsified. I view this as a unification of the Popperian and Kuhnian philosophies of science; for the present discussion, what is relevant is that, in this framework, it is a positive feature of a model to have strong assumptions.
2. Assume no model or a partial model or only a model for the data collection, not the data-generating process. Methods here include second-order inference, proportional hazards models, various signal processing approaches (for example, wavelet shrinkage and lasso regression), and the jackknife and bootstrap. The key idea here is: any model will certainly be wrong, so it’s best to anticipate this and develop robust methods that perform well without strong assumptions.
[T]he mainstream of econometrics seems to me to follow the second approach, but Heckman (2007) argues the opposite, that it is through strong assumptions that we learn about social science. Biomedical statistics features, at one extreme, elaborate multi-compartment pharmacological models and, at the other, generalized estimating equations that fit models to hierarchical data structures while avoiding modeling the individual data points. In the long run there may be a synthesis of highly complex models that have many of the features of nonparametric approaches (for example, the additive regression trees of Chipman, George, and McCulloch, 2005)…”
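Gelman’s second strategy is perhaps most easily seen in the bootstrap: resample the observed data with replacement and take the spread of a statistic across resamples as its standard error, with no model assumed for the data-generating process. A minimal sketch (the sample data, resample count, and function names here are illustrative choices, not from the text):

```python
import random

def bootstrap_se(data, stat, n_boot=2000, seed=0):
    """Estimate the standard error of `stat` by resampling `data`
    with replacement -- no model for the data-generating process."""
    rng = random.Random(seed)
    n = len(data)
    reps = []
    for _ in range(n_boot):
        # Draw a resample of the same size, with replacement.
        sample = [data[rng.randrange(n)] for _ in range(n)]
        reps.append(stat(sample))
    # Standard deviation of the statistic across resamples.
    mean = sum(reps) / n_boot
    var = sum((r - mean) ** 2 for r in reps) / (n_boot - 1)
    return var ** 0.5

data = [2.1, 3.4, 1.8, 5.0, 4.2, 2.9, 3.7, 4.8, 2.5, 3.1]
se_mean = bootstrap_se(data, lambda xs: sum(xs) / len(xs))
```

For the sample mean this should land near the analytic standard error s/√n, but the procedure applies unchanged to statistics (medians, ratios) with no convenient closed form, which is the point of the non-model-based approach.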

↓

“History reflects a gradual ‘externalization’ of measurement in terms of Carnap’s terminology (1950): the development of measurement instruments is initially for ‘internal questions’ and moves gradually towards ‘external questions’. For example, parameters are internal within models, whereas the existence of models is external with respect to the parameters. Econometric research has moved from the issue of how to optimally estimate parameters to the harder issue of how to measure and hence evaluate the efficiency, fruitfulness and simplicity of the models, i.e. the relevance of models as measuring instruments.”

↓

“[I]f a count of correct answers or a sum of ratings can provide a meaningful basis for invariant, additive quantification, then a Rasch model holds. Even when data are not evaluated for fit to a Rasch model, even when the invariance and additivity properties of quantitative measurement are ignored, use of test, survey, or assessment scores as though they are [unweighted] measures inherently implies acceptance of Rasch’s separability theorem.
This is because the parameter separation theorem is nothing less than a formal representation of the rigorous independence of figure and meaning, or of name and concept (Fisher, 2003a, 2003b, 2004b), that must be assumed in any communication, even in the discourses of deconstruction (Ricoeur, 1977, p. 293; Derrida, 1982, p. 229; Derrida, 1989, p. 218; Gasché, 1987, p. 5). Though not obvious at first blush, postmodern philosophy has multiple points of contact and potentially productive associations to be found in mathematics (Tasić, 2001). Rasch’s mathematics, for instance, make tests of the qualitative hypothesis of quantitative meaningfulness (Narens, 2002) more accessible and practical than most work in this area. And in so doing, it taps deeply into the history of measurement and deploys rich possibilities for mathematical thinking that remain largely unexplored (Wright, 1988, 1997).”
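The separability invoked above can be made concrete in a few lines. Under the Rasch model the log-odds of a correct response is θ − b (person ability minus item difficulty), so the difference in log-odds between any two items is b₂ − b₁ for every person: item comparisons are independent of the person parameter, and symmetrically person comparisons are independent of the items. A minimal numerical sketch (the parameter values are illustrative, not from the text):

```python
import math

def rasch_prob(theta, b):
    """Rasch model: probability of a correct response for a person
    with ability `theta` on an item with difficulty `b`."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def logit(p):
    return math.log(p / (1.0 - p))

b1, b2 = -0.5, 1.2              # two item difficulties (illustrative)
for theta in (-2.0, 0.0, 3.0):  # any person ability whatsoever
    diff = logit(rasch_prob(theta, b1)) - logit(rasch_prob(theta, b2))
    # The item comparison comes out the same for every person: b2 - b1.
    assert abs(diff - (b2 - b1)) < 1e-9
```

This invariance is what licenses summing raw scores: in the Rasch model the total count of correct answers is a sufficient statistic for θ, so the person and item parameters can be estimated separately.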

↓

“Irving Fisher is widely known for what is called a separation theorem (I. Fisher, 1930, ch. 6-8). The basic principle is fundamentally the same as Rasch’s separability theorem, but with an economic twist. The theorem separates managerial opportunities for productivity from entrepreneurial market opportunities. The point is that a firm’s basic objective is the maximization of its current value, no matter what the investment preferences and financing sources of the owners happen to be.
The Fisher Separation Theorem posits that investment budgeting decisions are made in a two-stage process. First, entrepreneurial capital investment decisions are held to be independent of the preferences of the owner, and second, the investment decision is independent of the financing decision. The story told by these relations became the basis of neoclassical macroeconomic theory, and each of them could be written as a multifaceted Rasch model (Linacre, 1989).”