To serve content from your custom domain, you'll first need to prove to Google that you own it. The usual route is to use FTP access to upload a verification file to your current host's storage.
Once you've verified your domain, create a storage bucket named exactly the same as your domain (e.g., www.mycustomdomainname.com) and upload your files to it. There are two ways to upload content: the dead-easy Storage Browser web interface, or gsutil, a Python app that gives you command-line access to GCP.
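If you'd rather script the upload than click through the UI, the two gsutil steps can be sketched as follows. The bucket name and ./public directory are placeholders, and the commands are only executed if gsutil is actually installed:

```python
import shutil
import subprocess

# Placeholders: the bucket name must match your domain exactly.
BUCKET = "www.mycustomdomainname.com"
SITE_DIR = "./public"

commands = [
    ["gsutil", "mb", f"gs://{BUCKET}"],                        # make the bucket
    ["gsutil", "-m", "cp", "-r", SITE_DIR, f"gs://{BUCKET}"],  # parallel recursive upload
]

for cmd in commands:
    if shutil.which("gsutil"):
        subprocess.run(cmd, check=True)
    else:
        print("would run:", " ".join(cmd))
```

The -m flag parallelizes the copy, which matters once a site has more than a handful of files.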
Even if you use the web UI to upload your files, you'll probably still want gsutil for some bulk operations. For example, to make files public through the UI you have to go into every folder individually (laborious, to say the least), whereas gsutil does the same job with one quick command. This recursively sets every file in the bucket to public read:

gsutil -m acl set -R -a public-read gs://bucketname
Once your files are uploaded and public, test the links from the Storage Browser; your bucket is then ready to go. The last step is to visit your domain registrar's site and add a CNAME record pointing your domain at GCP storage: for example, www.mydomainname.com points to c.storage.googleapis.com. After adding the alias, you can verify it's working with the free CNAME lookup tool at http://mxtoolbox.com.
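You can also do a quick sanity check from your own machine. The sketch below shells out to nslookup (if it's available) and looks for the expected CNAME target in its output; the domain name is a placeholder:

```python
import shutil
import subprocess

DOMAIN = "www.mydomainname.com"        # placeholder custom domain
EXPECTED = "c.storage.googleapis.com"  # GCP storage CNAME target

if shutil.which("nslookup"):
    # Query only for the CNAME record and scan the raw output.
    out = subprocess.run(
        ["nslookup", "-type=CNAME", DOMAIN],
        capture_output=True, text=True,
    ).stdout
    print("CNAME OK" if EXPECTED in out else "CNAME not found yet")
else:
    print(f"Check that {DOMAIN} has a CNAME record pointing to {EXPECTED}")
```

Keep in mind DNS changes can take a while to propagate, so a "not found yet" result right after editing the record is normal.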
That's it! You're now saving money and your files are being managed by someone else. Very nice.
Note that if you're moving from Windows hosting to GCP, you may be switching from an IIS web server to a Linux backend, which means your URLs become case-sensitive. This actually broke some of my image links, because I hadn't matched the case of the file paths exactly in my HTML and IIS doesn't care about such details.
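One way to catch these breakages before deploying is a small script that compares the case of src/href paths in your HTML against the files actually on disk. A minimal sketch (the directory layout and the regex are simplified assumptions, e.g. it skips absolute http:// URLs):

```python
import os
import re
import tempfile

def find_case_mismatches(root):
    """Report src/href paths in HTML whose case doesn't match a real file."""
    # Map lowercase relative path -> actual relative path on disk.
    actual = {}
    for dirpath, _, files in os.walk(root):
        for name in files:
            rel = os.path.relpath(os.path.join(dirpath, name), root).replace(os.sep, "/")
            actual[rel.lower()] = rel

    mismatches = []
    for dirpath, _, files in os.walk(root):
        for name in files:
            if not name.endswith(".html"):
                continue
            text = open(os.path.join(dirpath, name), encoding="utf-8").read()
            # Grab src/href values; [^":]+ excludes absolute URLs like http://...
            for ref in re.findall(r'(?:src|href)="([^":]+)"', text):
                real = actual.get(ref.lstrip("/").lower())
                if real and real != ref.lstrip("/"):
                    mismatches.append((ref, real))
    return mismatches

# Demo: an HTML file referencing images/logo.png when the file is Logo.PNG.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "images"))
open(os.path.join(root, "images", "Logo.PNG"), "w").close()
with open(os.path.join(root, "index.html"), "w") as f:
    f.write('<img src="images/logo.png">')
print(find_case_mismatches(root))  # [('images/logo.png', 'images/Logo.PNG')]
```

Running this against your site folder before uploading would have saved me the broken-image hunt.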