> The increasing specialization of the academic world and fields of
> research is a consequence of the ever growing body of knowledge being
> produced since the Renaissance - no one can be a generalist anymore. I
> suspect the same to be true in the realm of praxis; think of the vast
> array of knowledge an industrial designer has to (or should) possess
> today (everything from e.g. IT to LCA). I think Leonardo would
> encounter serious trouble and be baffled at his own incompetence if he
> were to embark on product development for, say, Sony today.
Your comment touches on an interesting observation that can be made in
the area of computer programming. Some argue that the "body of
knowledge" within software is growing rapidly; there are hundreds of
different programming languages and incredibly intricate libraries of
pre-existing work that can be drawn upon.
On the other hand, there have been (as near as I can make out) only
about six major advances in the mental models associated with
programming. While there are many syntaxes, and many mixtures of
aspects of these models, in my experience they can all be reduced to a
few (relatively) simple premises and assumptions. Taking it even
further, there are (I believe) only two major computer architectures
that a developer will interact with, and going to real extremes, it can
be argued that they are all Turing machines in the end.
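To make the "mental models" point concrete, here is a small sketch of my own (the function names and the example are mine, purely illustrative, not drawn from any particular curriculum): the same computation written under an imperative model and under a functional model. Once a programmer sees that these are the same underlying idea, moving between paradigms is largely a matter of surface syntax.

```python
def sum_of_squares_imperative(numbers):
    """Imperative model: mutable state, updated step by step."""
    total = 0
    for n in numbers:
        total += n * n
    return total


def sum_of_squares_functional(numbers):
    """Functional model: composition of map and fold, no mutation."""
    return sum(map(lambda n: n * n, numbers))


# Both express the same computation:
print(sum_of_squares_imperative([1, 2, 3]))  # 14
print(sum_of_squares_functional([1, 2, 3]))  # 14
```

The two bodies look different, but a reader who has the underlying model in mind can translate one into the other almost mechanically - which is the skill I am arguing we should teach.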
In my work and school experience, those who were able to look past the
syntax and understand the underlying mental models were the ones most
likely to succeed, both academically and in the workforce. In
particular, when exposed to a new programming language they quickly
determine its "profile", and from then on all they need is a thesaurus
or a Rosetta stone analogue. We had a motto in my department: "Give us
2 weeks and we'll be 80% to anywhere".
As an (aspiring) educator, I find myself torn between the desire to
teach these "essentials" of programming and the apparent desire on the
part of students (and business) for skills in a particular language.
How does all this link back to phd-design and to the previous message?
My belief (and that's all it is thus far) is that a good designer will
be a good designer regardless of their field of application. They may
need some time to understand the particular details of a new area
(materials, marketing, aesthetic sensibilities, etc.) but given a
little time they will be able to perform, and perform well. I will
happily concede that for exacting details they may have to turn to
others with particular domain-specific skills.
So to bring this back to design research, my assertion is that research
into design should operate at the
conceptual/ontological/philosophical/epistemological/rhetorical level.
Such research can then be applied (praxis?) wherever needed, through
tailoring suited to the area/discipline/project/team. I see this
assertion as analogous to stating that in design research, design must
have "primacy" and the area of application, while vital, must remain
"secondary".
> That being said - scientists still paint, and designers still invent.
:)
Jason