Ken,
There is a lot of detail that needs to be discussed to come to an understanding on these issues.
You describe a research and theory-making tradition that I understand. I've worked that way. I understand and can use the approach of gathering data to enable generalisation of unambiguously stated theory. I'm suggesting it isn't the only way.
You don't seem to realise that there are other ways, developed and in use. There are already established tools that can be repurposed to predict the outcomes of designs, such as the theories for optimum design of experiments. Think of the process of making a design as a design in itself, and also as an experiment. One standard approach is described in http://areaestadistica.uclm.es/s3ed/Documents/jlfidalgo.pdf
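To make this concrete, here is a minimal Python sketch of the D-optimality criterion used in optimal experimental design. The two-factor first-order model and the candidate designs are my own illustrative assumptions, not taken from the linked paper:

import numpy as np

def d_criterion(points):
    # Design matrix for the assumed model y = b0 + b1*x1 + b2*x2.
    X = np.column_stack([np.ones(len(points)), points])
    # D-optimality: maximise det(X'X), i.e. minimise the volume of the
    # confidence ellipsoid around the estimated model parameters.
    return np.linalg.det(X.T @ X)

# Two candidate 4-run designs over the region [-1, 1]^2 (assumed).
factorial = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]])  # corner points
axial = np.array([[0, -1], [-1, 0], [1, 0], [0, 1]])        # axial points

for name, design in [("factorial", factorial), ("axial", axial)]:
    print(name, d_criterion(design))
# The factorial design scores higher (64 vs 16): placing runs at the
# corners is D-optimal for a first-order model on this region.

The same criterion ranks candidate designs-of-the-design-process once you treat the process itself as the experiment.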
Alternatively, you could go the route of system dynamics, for which there are already many pre-existing models to predict future social changes resulting from the introduction of a designed intervention. The method for making the underlying theory for these predictive methods (which is sparing in its use of data) is as I described in my last post.
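As a concrete illustration, here is a minimal stock-and-flow model in the system dynamics style, tracking adoption of a designed intervention through a population. The Bass-style structure and every parameter value are illustrative assumptions of mine, not drawn from any published SD model:

def simulate_adoption(pop=10_000, p=0.01, q=0.4, dt=0.25, years=10):
    # Stock: adopters. Flows: innovation (p) and word of mouth (q).
    adopters, t, history = 100.0, 0.0, []
    while t <= years:
        potential = pop - adopters
        adoption_rate = p * potential + q * adopters * potential / pop
        history.append((t, adopters))
        adopters += adoption_rate * dt   # Euler integration of the stock
        t += dt
    return history

for t, a in simulate_adoption()[::8]:   # print every two simulated years
    print(f"year {t:4.1f}: {a:7.0f} adopters")

Note that the predictive content comes from the causal structure (the two flows), not from a large dataset.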
Alternatively again, you could make a similar theory basis for Discrete Simulation, as used, for example, in predicting the consequences of changes in health-system processes.
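Again as an illustration only, here is a minimal discrete-event sketch of a single-server clinic queue, the micro-level counterpart to the macro model above. The arrival and service rates are assumed values:

import random

def simulate_clinic(n_patients=1000, arrival_mean=10.0, service_mean=9.0, seed=1):
    # Single-server queue: patients arrive, wait if the clinician is busy.
    random.seed(seed)
    clock, server_free_at, waits = 0.0, 0.0, []
    for _ in range(n_patients):
        clock += random.expovariate(1.0 / arrival_mean)   # next arrival
        start = max(clock, server_free_at)                # wait if busy
        waits.append(start - clock)
        server_free_at = start + random.expovariate(1.0 / service_mean)
    return sum(waits) / len(waits)

print("mean wait, current process:   ", simulate_clinic())
print("mean wait, redesigned process:", simulate_clinic(service_mean=7.0))

Changing one parameter stands in for a redesigned process, and the simulation predicts the consequence before any real intervention is made.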
Note that both involve assembling chunks of pre-existing, well-tested theory.
A difference between System Dynamics and Discrete Simulation theory development is whether the model is macro- or micro-based. Making that choice derives from the sort of question I asked about continuous versus discrete representation of colour.
In either case, the data/evidence/empirical requirement is many times smaller than that of deriving classical theory or atheoretical theory, and in most cases the data can be pulled from existing research in other areas without collecting new data. For an example (in this case in the realm of anti-terrorism design) try http://www.systemdynamics.org/conferences/2010/proceed/papers/P1276.pdf and look at how little data is needed for the SD approach - see Table 2. Both approaches, however, use much less data than the classical scientific method you advocate.
One way to think of it, if you are obsessed with the data-theory connection and are ignoring Popper's falsification theory, is that 1) the justification has already been done for all the elemental theories of which a larger theory model is composed (and hence that data is not needed); and 2) by testing the internal theoretical integrity of the model combining the justified theories, you significantly add to the justification of, and confidence in, the whole (reducing the overall data requirement further). I can't do the statistical analysis in my head while writing, but I suggest it is very doable. The combined model offers a larger theory that, if the original causal analyses are correct, will behave in a mathematically similar manner to the real-world situation it represents. At that point, the only reasons you have for needing data are a) to check you haven't made mistakes in combining the theories; b) to check that key characteristics of the model's behaviour align with those of the real situation; and c) to align the scale of the theory to the real situation. Steps a) and b) usually use the same small dataset, and c) requires either minimal data or again uses the same dataset as a) and b). A sketch of that calibration step follows.
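Here is a minimal sketch of step c), reusing the simulate_adoption function from the system dynamics sketch above. It grid-searches the one free contact parameter q against four observations; the observed values are invented purely for illustration:

# Four hypothetical observations: (year, adopters). Invented for illustration.
observed = [(1.0, 450), (3.0, 2600), (5.0, 7100), (8.0, 9600)]

def fit_error(q):
    run = dict(simulate_adoption(q=q))        # time -> adopters lookup
    return sum((run[t] - y) ** 2 for t, y in observed)

# Grid search over the single free parameter.
best_q = min((k / 100 for k in range(10, 80)), key=fit_error)
print("calibrated q:", best_q)

Four data points calibrate the whole model because the causal structure is already justified; the data only checks and scales it.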
Neither is a new approach to research and theory-making; both have been available for around 50 years.
On medical research, what you say is correct: using the traditional medical research approaches costs too much and takes too long. That is why medical research (except on drugs) is rapidly transitioning to the approaches I have described. As you seem to object to me offering relevant links, I suggest instead that you Google "System Dynamics NIH" to find examples of how the National Institutes of Health is increasingly using system dynamics predictive approaches instead of traditional evidence-to-theory research methods.
Regards,
Terence
---
Dr Terence Love
PhD(UWA), BA(Hons) Engin. PGCEd, FDRS, PMACM, MISI
Love Services Pty Ltd
PO Box 226, Quinns Rocks
Western Australia 6030
Tel: +61 (0)4 3497 5848
[log in to unmask]
www.loveservices.com.au
--
-----Original Message-----
From: [log in to unmask] [mailto:[log in to unmask]] On Behalf Of Ken Friedman
Sent: Sunday, 21 February 2016 7:40 PM
To: PhD-Design <[log in to unmask]>
Subject: Re: Predictive Models
The problem with medical research is expense and time — it costs too much, it takes too long, and much of what we learn only helps a few patients. Progress is slow and painful.
If, on the other hand, people took your approach in medical research, medicine would not be making slow, painful progress. There would be no progress at all. When people simply borrow inappropriate mathematical models from other fields, the models can appear to work whether or not patients die.
Terry Love wrote:
—snip—
In case you haven't come across it....
A standard method of creating predictive models of complex situations with unknown behaviours is to identify causal relationships, create the model, then test the model for boundary conditions and behaviour-over-time correspondence, then identify what (sparse) data is needed to calibrate the model, and then, and only then, collect the small amount of calibration data and undertake the calibration. Prior to identifying causal relationships, one addresses practical aspects of the model/theory structure and what is manageable and useful. This enables pre-theory/modelling decisions to be made. This prior analysis is where my question was aimed.
-----------------------------------------------------------------
PhD-Design mailing list <[log in to unmask]>
Discussion of PhD studies and related research in Design
Subscribe or Unsubscribe at https://www.jiscmail.ac.uk/phd-design
-----------------------------------------------------------------