Apigee

Apigee / API Gateway

Apigee Edge is a platform for developing and managing APIs. By fronting services with a proxy layer, Apigee provides security, threat protection, spike arrest, rate limiting, quotas, analytics, load balancing, and more.

Apigee Edge typically runs more than one message processor, which we need to handle carefully when configuring the Quota policy; otherwise each message processor keeps its own counter.
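
Because the counters are per message processor by default, a Quota policy is usually configured with distributed, synchronous counters so the limit is shared across them. A minimal sketch, assuming an illustrative policy name and limit:

<Quota async="false" continueOnError="false" enabled="true" name="Quota-1" type="calendar">
    <DisplayName>Quota-1</DisplayName>
    <!-- Allow up to 1000 calls per calendar month (illustrative numbers) -->
    <Allow count="1000"/>
    <Interval>1</Interval>
    <TimeUnit>month</TimeUnit>
    <!-- Share one counter across all message processors -->
    <Distributed>true</Distributed>
    <Synchronous>true</Synchronous>
    <StartTime>2024-01-01 00:00:00</StartTime>
</Quota>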

On the Edge portal, publishers and product owners will be able to:

1.     Develop and publish API proxies

2.     Package API proxies into API products

3.     Publish API products

4.     Manage consumer subscriptions to API products

5.     Monitor API proxy traffic/errors/etc.

The following image shows client apps communicating with backend services without Apigee:




Because providers make their services available over the web, they must ensure that they have taken all necessary steps to secure and protect their services from unauthorized access. As a service provider, consider:

·        Security: How will you control access to your services to prevent unauthorized access?

·        Compatibility: Will your services work across different platforms and devices?

·        Measurability: How can you monitor your services to make sure they are available?

·        Monetization: How can you track and bill customers for access to your services?

·        And many other considerations

 

Make services available through Apigee Edge

Apigee Edge enables you to provide secure access to your services with a well-defined API that is consistent across all of your services, regardless of service implementation. A consistent API:

·        Makes it easy for app developers to consume your services.

·        Enables you to change the backend service implementation without affecting the public API.

·        Enables you to take advantage of the analytics, monetization, developer portal, and other features built into Edge.

The following image shows an architecture with Apigee Edge handling the requests from client apps to your backend services:


Because app developers make HTTP/HTTPS requests to an API proxy, rather than directly to your services, developers do not need to know anything about the implementation of your services. All the developer needs to know is:

·        The URL of the API proxy endpoint.

·        Any query parameters, headers, or body parameters passed in a request.

·        Any required authentication and authorization credentials.

·        The format of the response, including the response data format, such as XML or JSON.

The API proxy isolates the app developer from your backend service. Therefore, you are free to change the service implementation as long as the public API remains consistent. For example, you can change a database implementation, move your services to a new host, or make any other changes to the service implementation. By maintaining a consistent frontend API, existing client apps will continue to work regardless of changes on the backend.

You can use policies on the API proxy to add functionality to a service without having to make any changes to the backend service. For example, you can add policies to your proxy to perform data transformations and filtering, add security, execute conditional logic or custom code, and perform many other actions. The important thing to remember is that you implement policies on Edge, not on your backend server.
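
As one hedged example of such a transformation, an XMLToJSON policy can convert a backend XML response to JSON entirely on Edge, with no change to the backend. A minimal sketch (policy name is illustrative):

<XMLToJSON async="false" continueOnError="false" enabled="true" name="XML-to-JSON-1">
    <DisplayName>XML to JSON</DisplayName>
    <!-- Convert the backend's XML response body to JSON before it reaches the client -->
    <Source>response</Source>
    <OutputVariable>response</OutputVariable>
    <Options/>
</XMLToJSON>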

Apigee Edge UI



 Organizations  

An organization is the top-level container in Apigee Edge. It contains all your API proxies and related resources. We can have separate organizations (or, more commonly, separate environments within an organization) for dev, test, staging, and prod.

 

API Proxies  

The API proxy decouples your backend service implementation from the API that developers consume. This shields developers from future changes to your backend services. As you update backend services, developers, insulated from those changes, can continue to call the API uninterrupted.

Create the API proxy  

The easiest way to create an API proxy is using the Create Proxy wizard.  

Using the Trace option, we can check the flow of the proxy at run time.

 

 

ProxyEndpoint and TargetEndpoint: A route determines the path of a request from the ProxyEndpoint to the TargetEndpoint. We can include policy steps in the PreFlow of both the ProxyEndpoint and the TargetEndpoint.
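
A minimal sketch of the two endpoints (the base path, target URL, and policy name are illustrative): the ProxyEndpoint carries a PreFlow step and a RouteRule, and the RouteRule names the TargetEndpoint that forwards to the backend:

<ProxyEndpoint name="default">
    <PreFlow name="PreFlow">
        <Request>
            <!-- Policy attached as a step on every incoming request -->
            <Step>
                <Name>Spike-Arrest-1</Name>
            </Step>
        </Request>
    </PreFlow>
    <HTTPProxyConnection>
        <BasePath>/v1/weather</BasePath>
    </HTTPProxyConnection>
    <!-- Route matching requests to the TargetEndpoint named "default" -->
    <RouteRule name="default">
        <TargetEndpoint>default</TargetEndpoint>
    </RouteRule>
</ProxyEndpoint>

<TargetEndpoint name="default">
    <HTTPTargetConnection>
        <URL>https://backend.example.com/weather</URL>
    </HTTPTargetConnection>
</TargetEndpoint>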

 


Here are some of the policies:



 AssignMessage policy:

The Assign Message policy changes or creates new request and response messages during the API proxy Flow. The policy lets you perform the following actions on those messages:

·        Add new form parameters, headers, or query parameters to a message

·        Copy existing properties from one message to another

·        Remove headers, query parameters, form parameters, and/or message payloads from a message

·        Set the value of existing properties in a message (creating them if they are not present)

With Assign Message, you typically add, change, or remove properties of either the request or response. However, you can also use Assign Message to create a custom request or response message and pass it to an alternative target.
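
A minimal AssignMessage sketch along those lines (the header and query parameter names are illustrative), setting a request header and removing a query parameter:

<AssignMessage async="false" continueOnError="false" enabled="true" name="Assign-Message-1">
    <DisplayName>Assign Message</DisplayName>
    <!-- Add a header to the request -->
    <Set>
        <Headers>
            <Header name="x-client-region">us-east</Header>
        </Headers>
    </Set>
    <!-- Remove a query parameter before the request reaches the backend -->
    <Remove>
        <QueryParams>
            <QueryParam name="debug"/>
        </QueryParams>
    </Remove>
    <IgnoreUnresolvedVariables>true</IgnoreUnresolvedVariables>
    <AssignTo createNew="false" transport="http" type="request"/>
</AssignMessage>

The example below is, strictly speaking, a KeyValueMapOperations policy rather than an AssignMessage policy: it reads stored user credentials from the user-credentialskvm key value map into private variables.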

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<KeyValueMapOperations async="false" continueOnError="false" enabled="true" name="Assign-usercredentials" mapIdentifier="user-credentialskvm">
    <DisplayName>Assign user credentials</DisplayName>
    <Properties/>
    <Get assignTo="private.user">
        <Key>
            <Parameter>USER_ID</Parameter>
        </Key>
    </Get>
    <Get assignTo="private.pwd">
        <Key>
            <Parameter>USER_PASSWORD</Parameter>
        </Key>
    </Get>
    <Scope>environment</Scope>
</KeyValueMapOperations>

 

SpikeArrest policy:

The Spike Arrest policy protects against traffic spikes with the <Rate> element. This element throttles the number of requests processed by an API proxy and sent to a backend, protecting against performance lags and downtime.

 

<SpikeArrest async="false" continueOnError="false" enabled="true" name="Spike-Arrest-1">
     <DisplayName>Spike-Arrest-Policy1</DisplayName>
     <Rate>30ps</Rate>
     <Identifier ref="request.header.some-header-name"/>
     <MessageWeight ref="request.header.weight"/>
     <UseEffectiveCount>true</UseEffectiveCount>
</SpikeArrest>

ExtractVariables policy

The Extract Variables policy extracts content from a request or response and sets the value of a variable to that content. You can extract any part of the message, including headers, URI paths, JSON/XML payloads, form parameters, and query parameters. The policy works by applying a text pattern to the message content and, upon finding a match, sets a variable with the specified message content.

 

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<ExtractVariables async="false" continueOnError="false" enabled="true" name="Extract-Variables-apikey">
    <Source>request</Source>
    <IgnoreUnresolvedVariables>false</IgnoreUnresolvedVariables>
    <XMLPayload>
        <Namespaces>
            <Namespace prefix="soapenv">http://schemas.xmlsoap.org/soap/envelope/</Namespace>
            <Namespace prefix="wsse">http://schemas.xmlsoap.org/ws/2002/07/secext</Namespace>
        </Namespaces>
        <Variable name="apikey" type="string">
            <XPath>/soapenv:Envelope/soapenv:Header/wsse:Security/wsse:ApikeyToken/wsse:apikey/text()</XPath>
        </Variable>
    </XMLPayload>
    <VariablePrefix>soapHeaderAPIKey</VariablePrefix>
</ExtractVariables>

 

Attaching and configuring policies in the UI  

Adding policy-based capabilities to an API proxy is a two-step process:

1.     Configure an instance of a policy type.

2.     Attach the policy instance to a Flow.

The diagram below shows the relationship between policies and Flows. As you can see, a policy is attached to a Flow as a processing "Step". To configure the desired behavior for your API, you need to understand a little bit about Flows.
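
As a sketch of that attachment (the flow name and condition are illustrative), a conditional flow in the ProxyEndpoint attaches the Spike-Arrest-1 policy defined earlier as a request Step:

<Flow name="GetForecast">
    <!-- Only run this flow for GET requests to /forecast -->
    <Condition>(proxy.pathsuffix MatchesPath "/forecast") and (request.verb = "GET")</Condition>
    <Request>
        <Step>
            <Name>Spike-Arrest-1</Name>
        </Step>
    </Request>
    <Response/>
</Flow>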



One type of policy that is commonly used is SpikeArrest. SpikeArrest prevents sudden increases in message traffic that might swamp your backend services.

Apigee provider/subscriber flow


 


 Basic API key validation  

ProxyEndpoint Request Flow:

1.     SpikeArrest

2.     XMLThreatProtection or JSONThreatProtection

3.     API key validation (see the VerifyAPIKey sketch after these lists)

4.     Quota

5.     ResponseCache

ProxyEndpoint Response Flow:

1.     ResponseCache
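
For step 3 above, the key is typically checked with a VerifyAPIKey policy. A minimal sketch, assuming the key arrives in a query parameter named apikey:

<VerifyAPIKey async="false" continueOnError="false" enabled="true" name="Verify-API-Key-1">
    <DisplayName>Verify API Key</DisplayName>
    <!-- Read the API key from the request query string and validate it -->
    <APIKey ref="request.queryparam.apikey"/>
</VerifyAPIKey>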

 

Reports

 

We can create custom reports based on different dimensions, such as developer app name, total traffic, proxy name, proxy errors, target errors, and so on.

 

