You probably wouldn’t with a Pro, but you might between an iPad Pro and a MacBook Air.
With the Foundation Models API, they basically said there will be one model size across the entire platform, which makes smarter models on a MacBook Pro unrealistic; only faster ones are possible.
Isn't Private Cloud Compute already enabling the more powerful models to be run on the server? That way the on-device models don't have as much pressure to be The One.