How to follow trends of passages through the Suez Canal

Do you want to track the trends of passages through the Suez Canal to identify changing trading patterns?

This code captures passages through the Suez Canal (polygon type = canal) for all dry bulk, tanker, gas and container vessels.

The result lets you easily analyze and visualize the trend on a graph (see screenshot). You should be able to spot the drop in Suez Canal traffic after the start of the conflict in the south of the Red Sea.

```python
import requests
import pandas as pd

token = "INSERT MY TOKEN"  # Enter your personal token provided by AXSMarine to use AXSMarine Data
canal_name = 'Suez Canal'  # Canal whose traffic you want to analyze; alternatives: 'Panama Canal' or 'Kiel Canal'

# For all API calls to AXSMarine's Data, the token is required in the Authorization header
headers = {
    "Authorization": "Bearer {}".format(token)
}

# ----------------------------------------------------------------------------------#
# 1. Using the locations API                                                        #
#    You can search the id of this particular canal.                                #
# ----------------------------------------------------------------------------------#
url = 'https://apihub.axsmarine.com/dry/locations/canals/v1'
response = requests.get(url, headers=headers)
response.raise_for_status()
canal_list = pd.json_normalize(response.json())
canal = canal_list[canal_list['name'] == canal_name]
if canal.empty:
    print(f"{canal_name} not found. Available list: {canal_list['name'].tolist()}")
    exit(1)

# ----------------------------------------------------------------------------------#
# 2. Using the global events API                                                    #
#    You can fetch all events matching the selected canal, for all vessels         #
#    tracked by AXSMarine across all segments (Dry, Tanker, Liner)                 #
# ----------------------------------------------------------------------------------#
url = "https://apihub.axsmarine.com/global/events/v1"
variables = {
    "pageSize": 5000,
    'polygonIds': canal['id'].tolist(),
    'entryDate': {'from': '2013-01-01'},
    'duration': {'from': 1, 'to': 30}
}
query = """
query polygonEvents($pageSize: Int, $afterCursor: String, $polygonIds: [Int], $entryDate: RangeDate, $duration: RangeInt) {
    polygonEvents(first: $pageSize, after: $afterCursor, polygonIds: $polygonIds, entryDate: $entryDate, duration: $duration) {
        pageInfo { endCursor }
        edges {
            node {
                vessel { imo, type }
                entryAis { time, latitude }
                outAis { latitude }
            }
        }
    }
}
"""

# Paginate through the cursor-based connection until a page comes back short
events = pd.DataFrame()
while True:
    print(f"Call GraphQL to {url} with variables {variables} ...")
    response = requests.post(url, json={'query': query, 'variables': variables}, headers=headers)
    response.raise_for_status()
    data = response.json()['data']['polygonEvents']
    events = pd.concat([events, pd.json_normalize([i['node'] for i in data['edges']])])
    if len(data['edges']) < variables['pageSize']:
        break
    variables['afterCursor'] = data['pageInfo']['endCursor']

events.to_csv(f'{canal_name}_events.csv', index=False)
events['entryAis.time'] = pd.to_datetime(events['entryAis.time'])
events['entryAis.day'] = events['entryAis.time'].dt.date

# Northbound transits: entered at the southern end of the canal, exited at the northern end
events[(events['entryAis.latitude'] < 30.2) & (events['outAis.latitude'] > 30.8)] \
    .groupby(['entryAis.day', 'vessel.type']).size().reset_index(name='counts') \
    .to_csv(f'{canal_name} northbound.csv', index=False)

# Southbound transits: entered at the northern end of the canal, exited at the southern end
events[(events['entryAis.latitude'] > 30.8) & (events['outAis.latitude'] < 30.2)] \
    .groupby(['entryAis.day', 'vessel.type']).size().reset_index(name='counts') \
    .to_csv(f'{canal_name} southbound.csv', index=False)
```
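Before charting the trend, daily transit counts are noisy, so a rolling average makes the pattern easier to read. The sketch below (not part of the original script, and using made-up figures) assumes a frame with the same `entryAis.day` and `counts` columns as the CSVs written above, and smooths it with a 7-day rolling mean; feed the resulting column into your plotting library of choice.

```python
import pandas as pd

# Illustrative sketch with made-up daily counts; in practice, load one of the
# CSVs produced by the script, e.g. pd.read_csv('Suez Canal northbound.csv')
daily = pd.DataFrame({
    'entryAis.day': pd.date_range('2023-12-01', periods=10, freq='D'),
    'counts': [30, 28, 31, 29, 27, 15, 12, 10, 9, 8],
}).set_index('entryAis.day')

# 7-day rolling mean; min_periods=1 keeps the first few days instead of NaN
daily['rolling_7d'] = daily['counts'].rolling(7, min_periods=1).mean()
print(daily['rolling_7d'].round(2).tolist())
```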