If you have a small number of pages that you want Google to recrawl, the best way is to log into Google Search Console and use the Fetch as Google feature for those URLs. This is a manual process, so it won't work for a large number of pages, and if you change those pages again you will have to repeat it.
Other than that, you can improve your website to make it crawler friendly:
Reduce the size of your pages
Googlebot's quota for how many pages it downloads from your site each day depends in part on how long those pages take to download. Reducing their size can really help.
Minify your HTML, CSS, and JS. Optimize your images. Don't serve data: URI inline images to Googlebot. Remove unused code, and remove sections that users don't read.
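To get a feel for how heavy a page actually is, you can compare what it transfers over the wire with its decompressed HTML size. Here is a minimal sketch, assuming the third-party requests package and a hypothetical https://example.com/ URL:

```python
import requests

URL = "https://example.com/"  # hypothetical; substitute one of your own pages

# First request: stream the response and read the raw socket data, which is
# the (usually compressed) payload actually sent over the wire.
wire = requests.get(URL, stream=True)
wire_kb = len(wire.raw.read()) / 1024

# Second request: let requests decompress the body so we see the full HTML size.
full = requests.get(URL)
html_kb = len(full.content) / 1024

print(f"over the wire: {wire_kb:.1f} KB, decompressed HTML: {html_kb:.1f} KB")
```

If the two numbers are far apart, compression is already doing a lot of the work; if the decompressed size itself is large, minification and removing unused markup are where the wins are.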
Improve your server performance
Similarly, Googlebot will download more if your server is faster. Pay attention to "time to first byte" and try to optimize it: the longer your server spends "thinking" about a request before it starts returning a response, the more time Googlebot is just waiting around.
Find a reputable, fast web host. Enable caching where appropriate. If you wrote the code for your site yourself, profile it and figure out what is slow.
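One rough way to check time to first byte is to time how long it takes before any of the response body arrives. A sketch, again assuming the third-party requests package and a hypothetical URL:

```python
import time
import requests

URL = "https://example.com/"  # hypothetical; point it at your own server

start = time.perf_counter()
resp = requests.get(URL, stream=True)   # returns once the response headers arrive
resp.raw.read(1)                        # block until the first byte of the body
ttfb_ms = (time.perf_counter() - start) * 1000

print(f"approximate time to first byte: {ttfb_ms:.0f} ms")
```

This measurement includes DNS lookup and connection setup, so run it a few times and watch the trend rather than a single number; curl's -w '%{time_starttransfer}' option gives a comparable figure from the command line.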
Eliminate duplicate URLs
If Googlebot is crawling the same content at two different URLs, that is time and energy it isn't spending checking for updates on other pages.
Make sure you link only to your canonical URLs, and implement 301 permanent redirects from every alternate URL to its canonical version.
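As an illustration of the redirect side, here is a minimal sketch using Flask (a third-party package; the URL mapping is hypothetical) that answers known alternate URLs with a permanent redirect to their canonical versions:

```python
from flask import Flask, redirect, request

app = Flask(__name__)

# Hypothetical alternate-to-canonical mapping; in a real site this would come
# from your CMS or routing configuration.
CANONICAL = {
    "/index.php": "/",
    "/blog/my-post/print": "/blog/my-post",
}

@app.before_request
def redirect_to_canonical():
    target = CANONICAL.get(request.path)
    if target is not None:
        # 301 tells Googlebot the alternate URL is permanently replaced, so it
        # can stop spending crawl budget on it and consolidate signals on the
        # canonical address.
        return redirect(target, code=301)

@app.route("/blog/my-post")
def my_post():
    return "canonical content"

if __name__ == "__main__":
    app.run()
```

The same effect can be achieved with your web server's own rewrite rules; the important part is that every alternate URL answers with a single 301 to one canonical address.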
Increase your site's reputation
Googlebot still uses PageRank to determine your site's overall crawl budget, and it uses each page's individual PageRank to decide how often to fetch that page.
Getting additional external links to your site is the only way to increase your PageRank. Your site will naturally attract links as it ages. Consider creating link-bait content, building widgets that encourage linking, or finding a few places where you can recommend your own site with a link.
Update your content frequently
If you change the content of a page often, Googlebot will recrawl that particular page more often. For example, if your home page lists your most recent posts, Google will come back to it more frequently than it will check on static articles that haven't changed in months.
Some people will suggest sitemaps and feeds, but those don't get your site crawled more frequently. Google ignores the changefreq and lastmod fields in sitemaps because many webmasters don't put accurate data in them, and sitemaps don't usually help with getting your site crawled or indexed better. See The Sitemap Paradox.