In the Bayes approach the model-data summary is arguably the observed likelihood function; in the frequentist approach it is arguably a p-value function assessing a least-squares or maximum-likelihood departure; and in the higher order likelihood approach it is the observed likelihood together with a canonical reparameterization. For likelihood the obvious method of combining is to add log-likelihoods from independent sources: this is in the nature of likelihood itself and is also an implicit Bayes imperative, as only likelihood is used in the Bayesian argument. For the familiar frequentist approach the combining of p-values is often ad hoc: we discuss first a Fisher proposal and then offer a likelihood-based alternative. For the higher order likelihood approach the combining begins with the standard summary, which is likelihood plus a canonical reparameterization: we develop the appropriate higher order combining procedure.

For the p-value summary, Fisher (1973) proposed a quick and easy method for combining p-values from independent investigations: multiply them together and use chi-square tables. The proposal received criticism that it did not address power and other conventional criteria, but Fisher had assumed quite clearly that such related information was unavailable. We use first order likelihood theory to derive a simple modification: the p-values are converted to likelihood values, the likelihood values to observed likelihood functions, and these in turn to a new composite p-value.

Higher order likelihood offers further refinement: use the standard summary involving the log-likelihood $\ell(\theta)$ and the canonical reparameterization $\varphi(\theta)$; the combining from independent investigations then amounts to adding the observed log-likelihoods $\ell_i(\theta)$ and weighting and adding the reparameterizations $\varphi_i(\theta)$. We develop this information-combining procedure: add log-likelihoods and suitably weight and add canonical parameters. Some examples follow.
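As a first computational illustration, Fisher's product rule is only a few lines of code: under the null hypothesis each independent p-value is uniform on $(0,1)$, so $-2\sum_i \log p_i$ follows a chi-square distribution on $2k$ degrees of freedom. The Python sketch below is a minimal version; the three p-values in the usage line are hypothetical.

```python
# A minimal sketch of Fisher's combination rule.  Under the null each
# independent p-value is Uniform(0, 1), so -2*log(p_i) is chi-square on 2
# degrees of freedom and the sum is chi-square on 2k degrees of freedom.
import numpy as np
from scipy.stats import chi2

def fisher_combine(pvals):
    """Combine independent p-values by Fisher's product method."""
    pvals = np.asarray(pvals, dtype=float)
    statistic = -2.0 * np.sum(np.log(pvals))                 # -2 sum log p_i
    return statistic, chi2.sf(statistic, df=2 * len(pvals))  # upper-tail prob.

# Three individually unconvincing (hypothetical) p-values give stronger
# combined evidence.
stat, p = fisher_combine([0.11, 0.06, 0.09])
print(f"chi-square statistic {stat:.3f} on 6 df, combined p = {p:.4f}")
```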
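The likelihood-based alternative can be sketched under a simple working assumption, which is illustrative rather than the exact construction developed in the paper: each p-value is treated as the upper-tail normal probability of a score $z_i$ from an $N(\theta,1)$ model, giving observed log-likelihood $\ell_i(\theta) = -(z_i-\theta)^2/2$; adding these gives a composite likelihood maximized at $\bar z$, and the signed likelihood root for the null value $\theta = 0$ is $\sqrt{k}\,\bar z$, a Stouffer-type combination.

```python
# A sketch of the likelihood route to combining p-values under a normal-score
# working model (an illustrative assumption, not necessarily the paper's exact
# construction): z_i = Phi^{-1}(1 - p_i) is treated as an observation from
# N(theta, 1), so l_i(theta) = -(z_i - theta)^2 / 2; adding the l_i gives a
# composite likelihood maximized at zbar, and the signed likelihood root for
# theta = 0 is sqrt(k) * zbar.
import numpy as np
from scipy.stats import norm

def likelihood_combine(pvals):
    """Composite p-value from added normal-score log-likelihoods."""
    z = norm.isf(np.asarray(pvals, dtype=float))  # z_i with upper-tail prob p_i
    r = np.sqrt(len(z)) * z.mean()                # signed likelihood root at theta = 0
    return norm.sf(r)                             # composite upper-tail p-value

print(f"combined p = {likelihood_combine([0.11, 0.06, 0.09]):.4f}")
```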
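For the higher order procedure, a scalar-parameter sketch can combine per-investigation summaries $\ell_i(\theta)$ and $\varphi_i(\theta)$ and read the p-value function off the Barndorff-Nielsen approximation $r^* = r + r^{-1}\log(q/r)$. The equal weights and the exponential-rate toy data below are illustrative assumptions only; the appropriate weighting of the canonical parameters is what the procedure developed here supplies.

```python
# A scalar-parameter sketch of the higher order combination: add the observed
# log-likelihoods, weight and add the canonical reparameterizations (phi is
# assumed monotone in theta), and evaluate the p-value function at theta0 via
# the Barndorff-Nielsen approximation r* = r + (1/r) log(q/r).
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def combined_pvalue(loglik_fns, phi_fns, weights, theta0,
                    bounds=(1e-4, 10.0), h=1e-5):
    """Third-order p-value at theta0 from per-investigation summaries."""
    ell = lambda th: sum(f(th) for f in loglik_fns)                    # add log-likelihoods
    phi = lambda th: sum(w * g(th) for w, g in zip(weights, phi_fns))  # weighted phi
    th_hat = minimize_scalar(lambda th: -ell(th),
                             bounds=bounds, method="bounded").x        # overall MLE
    jhat = -(ell(th_hat + h) - 2 * ell(th_hat) + ell(th_hat - h)) / h**2  # observed information
    dphi = (phi(th_hat + h) - phi(th_hat - h)) / (2 * h)               # phi'(theta-hat)
    r = np.sign(th_hat - theta0) * np.sqrt(2 * (ell(th_hat) - ell(theta0)))  # likelihood root
    q = (phi(th_hat) - phi(theta0)) * np.sqrt(jhat) / dphi             # standardized phi departure
    return norm.cdf(r + np.log(q / r) / r)                             # Phi(r*)

# Hypothetical data: two exponential-rate investigations with
# l_i(theta) = n_i log(theta) - theta * s_i and canonical phi_i(theta) = theta;
# equal weights are an illustrative choice only.
studies = [(5, 12.0), (8, 15.0)]
logliks = [lambda th, n=n, s=s: n * np.log(th) - th * s for n, s in studies]
phis = [lambda th: th] * 2
print(f"p-value function at theta = 1: "
      f"{combined_pvalue(logliks, phis, [0.5, 0.5], 1.0):.4f}")
```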