
Getting started with Red Hat OpenShift on Google Cloud Platform

We recently announced that Red Hat’s container platform OpenShift Dedicated will run on Google Cloud Platform, letting you hook up your OpenShift clusters to the full portfolio of Google Cloud services. So what’s the best way to get started?

We recommend deploying a Kubernetes-based solution. In the example below, we’ll analyze incoming tweets using Google Cloud Pub/Sub (Google’s fully-managed real-time messaging service that allows you to send and receive messages between independent applications) and Google BigQuery (Google’s fully managed, no-ops, low cost analytics database). This can be the starting point for incorporating social insights into your own services.

Step 0: If you don’t have a GCP account already, please sign up for Cloud Platform, set up billing, and activate APIs.

Step 1: Next you’ll set up a service account. A service account is a way to interact with your GCP resources by using a different identity than your primary login and is generally intended for server-to-server interaction. From the GCP Navigation Menu, click on "Permissions."


Once there, click on "Service accounts."


Click on "Create service account," which will prompt you to enter a service account name. Provide a name relevant to your project and click on "Furnish a new private key." Leave the default "JSON" key type selected.


Step 2: Once you click "Create," a service account .json file will be downloaded to your browser’s downloads location.

Important: Like any credential, this file is an access mechanism for authenticating to and using resources in your GCP account, so KEEP IT SAFE! Never place this file in a publicly accessible source repo (e.g., public GitHub).

Step 3: We’ll be using the JSON credential via a Kubernetes secret deployed to your OpenShift cluster. To do so, first perform a base64 encoding of your JSON credential file:

$ base64 -i ~/path/to/downloads/credentials.json

Keep the output (a very long string) ready for use in the next step, where you’ll replace ‘BASE64_CREDENTIAL_STRING’ in the secret example (below) with the output just captured from the base64 encoding.

Important: Note that base64 is an encoding (not encryption) and can be readily reversed, so any file containing the base64 string is just as confidential as the credential file above.
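To see why the base64 string deserves the same care as the original file, here’s a minimal sketch (using a dummy payload in place of a real service-account credential) showing that the encoding is trivially reversible:

```python
import base64

# Dummy payload standing in for the real service-account JSON, which
# should never appear in examples or source control.
fake_credential = b'{"type": "service_account", "project_id": "demo-project"}'

# Roughly what `base64 -i credentials.json` produces:
encoded = base64.b64encode(fake_credential).decode("ascii")

# Anyone holding the encoded string can recover the original bytes:
decoded = base64.b64decode(encoded)
assert decoded == fake_credential
print(encoded)
```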

Step 4: Next you’ll create the Kubernetes secret inside your OpenShift cluster. A secret is the proper place to make sensitive information available to pods running in your cluster (like passwords or the credentials downloaded in the previous step). This is what your secret definition will look like (e.g., google-secret.yaml):

apiVersion: v1
kind: Secret
metadata:
  name: google-services-secret
type: Opaque
data:
  google-services.json: BASE64_CREDENTIAL_STRING

You’ll want to add this file to your source-control system (minus the credentials).

Replace ‘BASE64_CREDENTIAL_STRING’ with the base64 output from the prior step.
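If you’d rather script the substitution than paste the long string by hand, it can be sketched as follows (a minimal sketch: `make_secret_manifest` is a hypothetical helper, and the payload is a dummy stand-in for the real credential file):

```python
import base64

def make_secret_manifest(credential_bytes: bytes) -> str:
    """Render google-secret.yaml with the base64 string filled in.

    Hypothetical helper for illustration; not part of any SDK.
    """
    b64 = base64.b64encode(credential_bytes).decode("ascii")
    return (
        "apiVersion: v1\n"
        "kind: Secret\n"
        "metadata:\n"
        "  name: google-services-secret\n"
        "type: Opaque\n"
        "data:\n"
        f"  google-services.json: {b64}\n"
    )

# Dummy payload standing in for the downloaded credentials.json.
manifest = make_secret_manifest(b'{"type": "service_account"}')
print(manifest)
```

In practice you would read the real file with `open(path, "rb").read()` and write the result out as google-secret.yaml, keeping the generated file out of source control.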

Step 5: Deploy the secret to the cluster:

$ oc create -f google-secret.yaml

Step 6: Now you’re in a position to use Google APIs from your OpenShift cluster. To take your GCP-enabled cluster for a spin, try going through the steps detailed in the write-up: https://cloud.google.com/solutions/real-time/kubernetes-pubsub-bigquery

You’ll need to make two minor tweaks for the solution to work on your OpenShift cluster:

  1. For any pods that need to access Google APIs, modify the pod definition to reference the secret. The environment variable “GOOGLE_APPLICATION_CREDENTIALS” needs to be exported to the pod; for more on how Application Default Credentials work, see:
    https://developers.google.com/identity/protocols/application-default-credentials#howtheywork
    In the Pub/Sub-BigQuery solution, that means you’ll modify two pod definitions:
    • pubsub/bigquery-controller.yaml
    • pubsub/twitter-stream.yaml

    For example:

    apiVersion: v1
    kind: ReplicationController
    metadata:
      name: bigquery-controller
      labels:
        name: bigquery-controller
    spec:
      containers:
      …
        env:
        …
        - name: GOOGLE_APPLICATION_CREDENTIALS
          value: /etc/secretspath/google-services.json
        volumeMounts:
        - name: secrets
          mountPath: /etc/secretspath
          readOnly: true
      volumes:
      - name: secrets
        secret:
          secretName: google-services-secret
  2. Finally, anywhere the solution instructs you to use "kubectl," replace that with the equivalent OpenShift command "oc." 
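To make the environment variable concrete: Google’s client libraries check GOOGLE_APPLICATION_CREDENTIALS first when resolving Application Default Credentials, then read the service-account file it points to. The sketch below imitates that lookup with a dummy credential file standing in for the secret mounted at /etc/secretspath/google-services.json (a minimal sketch, not the libraries’ actual implementation):

```python
import json
import os
import tempfile

# Write a dummy service-account file standing in for the mounted secret.
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump({"type": "service_account", "project_id": "demo-project"}, f)
    cred_path = f.name

# The pod spec exports this variable via the `env` entry shown above.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = cred_path

# Roughly what Application Default Credentials does behind the scenes:
with open(os.environ["GOOGLE_APPLICATION_CREDENTIALS"]) as f:
    info = json.load(f)

assert info["type"] == "service_account"
print("loaded credential for project:", info["project_id"])
```

Because the lookup is driven entirely by the environment variable, your application code needs no cluster-specific changes: the same code works anywhere the secret is mounted and the variable is set.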

That’s it! If you follow along with the rest of the steps in the solution, you’ll soon be able to query (and see) tweets showing up in your BigQuery table, arriving via Cloud Pub/Sub. Going forward with your own deployments, all you need to do is follow the above steps, attaching the credential secret to any pod where you use Google Cloud SDKs and/or access Google APIs.

Join us at GCP Next!

If you’re attending GCP Next and want to experience a live hands-on walk-through of this and other solutions, please join us at the Red Hat OpenShift Workshop. Hope to see you there! If not, don’t miss all the Next sessions online.

Posted by Sami Zuhuruddin, Solutions Architect, Google Cloud Platform
