[under development]
o artificial language vs. common (natural) language
o philosophical questions vs. philosophical answers (!)
Philosophical questions are usually best cast in normal terms. Putting them in technical terms may often lead to inadequacy! (Example: the coherence theory of truth.)
Philosophical answers are supposed to be as precise as possible. This may often require technical terminology, or a careful explication of common terms. (Example: 'illocutionary act', for which no common term is available!)
The vagueness of common notions is very often neglected. Thus, for example, a large number of philosophers (Searle, Alston, Bach & Harnish, Pagin, and many more) debate whether 'statements' are (a) conventional acts, or (b) attempts at communication, or (c) acts of saying, or (d) acts of achieved communication, or whatnot -- while in fact the term 'statement' is so vague as to apply to each of these different kinds of act and is not restricted to any single one of them.
o strengths and weaknesses of artificial language in asking/answering philosophical questions
o The questions we find interesting can be expected to be construable in normal terms rather than in technical terms.
o For example, the question 'What is truth?' does not aim at any of the various technical definitions offered by philosophers (such as the coherence theory or the consensus theory), but clearly aims simply at the normal notion (appropriately construable along the lines of the correspondence theory of truth, as Tarski (1944) rightly emphasises).
o strengths and weaknesses of common language in asking/answering philosophical questions
o Normal language may lack terms with the precision required to express an adequate answer to a philosophical question. In such cases it is useful to introduce new (technical) terms.
o For example, the notion of a 'brain state' may be useful for answering questions about the nature of mind and psychic states, even though this term may be held not to be part of normal language (let us assume it is not).