I've been a fan of Derek Bickerton's writing and thinking on linguistics since happening upon Language and Species in a Philadelphia bookstore, disturbingly many decades ago. More Than Nature Needs, the latest addition to Bickerton's canon, is an intriguing and worthy one, and IMO is considerably deeper than its predecessor Adam's Tongue.
Adam's Tongue argues that the elements of human symbolic language likely emerged via scavenging behavior, as this was an early context in which humans would have needed to systematically refer to situations outside the shared physical environment of the speaker and hearer. This is an interesting speculation, showcasing Bickerton's inventiveness as a lateral thinker. MTNN continues in this vein, exploring the ways in which full language may have emerged from simple proto-language. However, MTNN draws more extensively on Bickerton's expertise as a linguist, and hence ends up being more profoundly thought-provoking and incisive.
As I see it, the core point of MTNN -- rephrased somewhat into my own terminology -- is that the developmental trajectory from proto-language to fully grammatical, proper language should be viewed as a combination of natural selection and cultural/psychological self-organization. To simplify a bit: Natural selection gave humans the core of language, the abstract "universal grammar" (UG) which underlies all human languages and is in some way wired into the brain; whereas cultural/psychological self-organization took us the rest of the way from universal grammar to actual specific languages.
The early stages of the book spend a bunch of time arguing against a purely learning-oriented view of language organization, stressing the case that some sort of innate, evolved universal grammar capability does exist. But the UG Bickerton favors is a long way from classic-Chomskian Principles and Parameters -- it is more of an abstract set of word-organization patterns, which requires lots of individual and cultural creativity to get turned into a language.
I suspect the view he presents is basically correct. I am not sure it's quite as novel as the author proposes; a review in Biolinguistics cites some literature where others present similar perspectives. In a broader sense, the mix of selection-based and self-organization-based ideas reminded me of the good old cognitive science book Rethinking Innateness (and lots of other stuff written in that same vein since). However, Bickerton presents his ideas far more accessibly and entertainingly than the typical academic paper, and provides interesting stories and specifics going along with the abstractions.
He also bolsters his perspective by relating it to the study of creoles and pidgins, an area in which he has done extensive linguistic research over many decades. He presents an intriguing argument that children can create a creole (a true language) in a single generation, building on the pidgins used by their parents and the other adults around them. I can't assess this aspect of his argument carefully, as I'm not much of a creologist (creologian??), but it's fascinating to read. There is ingenuity in the general approach of treating creole formation as a set of examples of recent-past language creation.
The specific linguistics examples in the book are given in a variant of Chomskian linguistics (i.e. generative grammar), in which deep and surface structures are distinguished, and grammar is assumed to involve "movement" of words from their positions in the deep structure to new positions in the surface structure. Here I tend to differ from Bickerton. Ray Jackendoff and others have made heroic efforts to modernize generative grammar and connect it with cognitive science and neuroscience, but in the end, I'm still not convinced it's a great paradigm for linguistic analysis. I much prefer Dick Hudson's Word Grammar approach to grammatical formalization -- which will not surprise anyone familiar with my work, as Word Grammar's theory of cognitive linguistics resembles aspects of the OpenCog AGI architecture that I am now helping develop, and Word Grammar is fairly similar to the link grammar currently used within OpenCog.
Word Grammar also has a deep vs. surface structure dichotomy - but the deep structure is a sort of semantic graph. In a Word Grammar version of the core hypothesis of MTNN, the evolved UG would be a semantic graph framework for organizing words and concepts, plus a few basic constraints for linearizing graphs into series of words (e.g. landmark transitivity, for the 3 Word Grammar geeks reading this). But the lexicon, along with various other particular linearization constraints dealing with odd cases, would emerge culturally and be learned by individuals.
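To make the linearization idea a bit more concrete, here is a toy sketch of my own (not Word Grammar's actual formalism, and the `Node`/`linearize` names and the before/after ordering rule are all invented for illustration): represent a sentence's deep structure as a dependency graph of words, then linearize it into a word sequence by keeping each dependent's subtree contiguous and on a specified side of its head -- a crude stand-in for constraints like landmark transitivity.

```python
# Toy illustration only: a sentence's "deep structure" as a dependency
# graph, flattened into surface word order by simple ordering rules.
from dataclasses import dataclass, field


@dataclass
class Node:
    word: str
    # Each dependent is paired with the side of the head it must appear on.
    deps: list = field(default_factory=list)  # list of (Node, 'before' | 'after')


def linearize(node):
    """Flatten the graph into a word sequence, keeping each dependent's
    subtree contiguous and on its specified side of the head (a crude
    stand-in for linearization constraints like landmark transitivity)."""
    before = [w for dep, side in node.deps if side == 'before'
              for w in linearize(dep)]
    after = [w for dep, side in node.deps if side == 'after'
             for w in linearize(dep)]
    return before + [node.word] + after


# Deep structure for "the big dog chased a cat"
cat = Node('cat', [(Node('a'), 'before')])
dog = Node('dog', [(Node('the'), 'before'), (Node('big'), 'before')])
chased = Node('chased', [(dog, 'before'), (cat, 'after')])

print(' '.join(linearize(chased)))  # the big dog chased a cat
```

In this picture, the evolved UG would correspond to the graph framework plus the handful of hard linearization constraints, while the lexicon and the messier language-specific ordering rules would be the culturally emergent, individually learned part.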
(If I were rich and had more free time, I'd organize some sort of linguistics pow-wow on one of my private islands, and invite Bickerton and Hudson to brainstorm together with me for a few weeks; as I really think Word Grammar would suit Bickerton's psycholinguistic perspective much better than the quasi-Chomskian approach he now favors.)
But anyhow, stepping back from deep-dive scientific quibbles: I think MTNN is very well worth reading for anyone interested in language and its evolution. Some of the technical bits will be slow going for readers unfamiliar with technical linguistics -- but this is only a small percentage of the book, and most of it reads very smoothly and entertainingly in the classic Derek Bickerton style. Soo ... highly recommended!