
# Free Daily Stock Prices, Metadata & Candlestick Patterns | US Exchanges

This offering includes daily OHLCV data, metadata, and 110 Japanese candlestick patterns and attributes for the exchanges listed below. We would like to know our customers and would appreciate any feedback they may offer, so subscription verification is enabled for this product. This product may not remain free forever.

## About

**Exchanges**

 

* BATS Exchange

* NASDAQ Stock Exchange

* New York Stock Exchange

* NYSE ARCA Exchange

* NYSE MKT Exchange (NYSE American)

* OTCBB, OTCCE, OTCGREY, OTCMKTS, OTCQB, OTCQX

* Pink Sheets

 

## Example

Description | Value
----|-----
Exchange | NASDAQ
Ticker | MSFT
Name | Microsoft Corporation
Country | USA
Currency | USD
Asset_Type | Common Stock
ISIN | US5949181045
Date | 2022-10-31
Open | 233.760000
High | 234.920000
Low | 231.150000
Close | 232.130000
Harami_Bearish | 1
CreateDate | 2022-12-24 20:36:00
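Since each pattern column holds a 0/1 flag (as the `Harami_Bearish | 1` row above suggests), screening for signals is a simple boolean filter once a daily CSV is loaded. A minimal sketch with pandas, using illustrative in-memory rows in place of a downloaded file (the second row is made up for demonstration):

```python
import pandas as pd

# Illustrative rows shaped like the example above; real files come from the
# exchange buckets and carry one column per pattern plus OHLCV and metadata.
rows = [
    {"Ticker": "MSFT", "Date": "2022-10-31", "Open": 233.76, "High": 234.92,
     "Low": 231.15, "Close": 232.13, "Harami_Bearish": 1},
    {"Ticker": "MSFT", "Date": "2022-11-01", "Open": 234.60, "High": 235.74,
     "Low": 227.33, "Close": 228.17, "Harami_Bearish": 0},
]
df = pd.DataFrame(rows)

# Pattern columns are flags: 1 when the pattern fired on that bar, 0 otherwise,
# so selecting signal days is a plain boolean mask.
signals = df[df["Harami_Bearish"] == 1]
print(signals[["Ticker", "Date", "Close"]])
```

The same mask works unchanged on a full exchange file loaded with `pd.read_csv`.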

* ISIN values may not exist for every asset.

* Data is not guaranteed to be 100% free of gaps; there may be some days for which we do not have OHLCV data for an asset.

* This product contains a file-based dataset.

* For file-based datasets, you may set up automatic exports of the CSV data to your own S3 buckets. You will then receive each daily revision in your own bucket. Each exchange has its own daily (or intraday) CSV file.

* With this file-based dataset and your automatic exports, you also have access to all published historical data.

* Data is published daily at approximately 9:45 PM Eastern Time.

* Once the revisions are programmatically exported to your own buckets, you can access the CSV files via the S3 console or a third-party app (e.g., CloudBerry), or you can retrieve the data programmatically with an API or CLI call against the bucket.

Here are three examples of how to do that:

**API call via Python**

This script grabs the most recent file from the bucket, prints some rows and saves the file locally:

```python
import boto3
import os
import csv

# Set up the AWS credentials and S3 client.
# Assumption: you have a local AWS credentials file with an access key and secret key,
# typically in C:\Users\username\.aws\credentials.
# Otherwise, these values need to be supplied, e.g.:
# s3 = boto3.client('s3',
#                   aws_access_key_id='YOUR_ACCESS_KEY',
#                   aws_secret_access_key='YOUR_SECRET_ACCESS_KEY',
#                   region_name='YOUR_REGION_NAME')
s3 = boto3.client('s3')

exchanges = ['us']

for exchange in exchanges:
    # Define the bucket name for the exchange
    bucket_name = '{}-candles'.format(exchange.lower())

    # List all objects in the specified S3 bucket; skip empty buckets
    response = s3.list_objects_v2(Bucket=bucket_name)
    if 'Contents' not in response:
        continue

    # Sort the objects by last modified date and take the most recent one
    objects = sorted(response['Contents'], key=lambda obj: obj['LastModified'], reverse=True)
    most_recent_object = objects[0]

    # Download the most recent CSV file to the local filesystem - make sure the folder exists
    file_name = most_recent_object['Key']
    local_file_path = r'C:\users\bob\downloads\{}\{}'.format(exchange, os.path.basename(file_name))
    s3.download_file(bucket_name, file_name, local_file_path)

    # Read the CSV file and print the top 100 rows
    with open(local_file_path, 'r') as file:
        csv_reader = csv.reader(file)
        for row_num, row in enumerate(csv_reader, 1):
            print(row)
            if row_num == 100:
                break

    print('CSV file for exchange {} has been printed successfully.'.format(exchange.upper()))

    # Optionally, move the file to a different location or delete it:
    # os.rename(local_file_path, '/path/to/destination/folder/new_filename.csv')
    # os.remove(local_file_path)
```

**Another API call via Python**

This script reads the most recent CSV from the bucket and prints the top X records as JSON:

```python
import json
import pandas as pd  # reading s3:// paths with pandas requires the s3fs package
import boto3

# Set up the AWS credentials and S3 client.
# Assumption: you have a local AWS credentials file with an access key and secret key,
# typically in C:\Users\username\.aws\credentials.
# Otherwise, these values need to be supplied, e.g.:
# s3 = boto3.client('s3',
#                   aws_access_key_id='YOUR_ACCESS_KEY',
#                   aws_secret_access_key='YOUR_SECRET_ACCESS_KEY',
#                   region_name='YOUR_REGION_NAME')
s3 = boto3.client('s3')

bucket_name = 'us-candles'

# Collect every CSV object in the bucket, paginating through the listing
object_list = []
paginator = s3.get_paginator('list_objects_v2')
page_iterator = paginator.paginate(Bucket=bucket_name)
for result in page_iterator:
    object_list += filter(lambda obj: obj['Key'].endswith('.csv'), result.get('Contents', []))

# The most recently modified CSV is the latest revision
object_list.sort(key=lambda x: x['LastModified'])
latest_key = object_list[-1]['Key']
full_path = f's3://{bucket_name}/{latest_key}'

# Create an iterator to read the CSV in chunks
chunk_size = 1000  # Adjust this based on your needs. Refers to rows.
csv_iterator = pd.read_csv(full_path, chunksize=chunk_size)

# Initialize an empty list to store the JSON records
json_records = []
for chunk in csv_iterator:
    # Convert each chunk to a list of dictionaries (JSON records)
    records = chunk.to_dict(orient='records')
    json_records.extend(records)

# Slice the list to get the top X records. To get 1 record, use json_records[:1].
# Each record is one row with ~125 fields.
top_x_records = json_records[:2]

# Serialize the list of records to JSON
final_json = json.dumps(top_x_records, indent=4, separators=(',', ': '))
print(final_json)
```

**AWS CLI call**

Run at an elevated command prompt or use a batch file. This script grabs the most recent file from the bucket and saves it locally:

```bat
:: Assumption: you have a local AWS credentials file in your path with an access key and secret key
:: Typically in C:\Users\username\.aws\credentials
@echo off
setlocal enabledelayedexpansion
set "exchanges=us"
for %%x in (%exchanges%) do (
    set "exchange=%%x"
    set "bucket_name=!exchange!-candles"
    set "download_location=C:\temp\!exchange!"
    for /f "usebackq tokens=*" %%o in (`aws s3api list-objects-v2 --bucket !bucket_name! --query "sort_by(Contents, &LastModified)[-1].Key" --output text`) do (
        set "file_name=%%o"
        if defined file_name (
            if not exist "!download_location!" mkdir "!download_location!"
            call :download_file "!bucket_name!" "!file_name!" "!download_location!"
        )
    )
)
exit /b

:download_file
aws s3 cp "s3://%~1/%~2" "%~3"
exit /b
```
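If you want every published CSV (the full history) rather than only the latest revision, a single `aws s3 sync` can mirror a bucket into a local folder; repeated runs copy only new or changed files. A sketch for the `us-candles` bucket used above, which assumes an active subscription and configured AWS credentials:

```shell
aws s3 sync s3://us-candles C:\temp\us --exclude "*" --include "*.csv"
```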

---

## All Candlestick Patterns

* Advance_Block

* Abandoned_Baby_Bearish

* Abandoned_Baby_Bullish

* Belt_Hold_Line_Bearish

* Belt_HoldLine_Bullish

* Body_Gap_Down

* Body_Gap_Up

* Body_Midpoint

* Breakaway_Bear

* Breakaway_Bull

* Candlestick_Bottom_Shadow_Size

* Candlestick_Color

* Candlestick_Shape

* Candlestick_Top_Shadow_Size

* Closing_Marubozu_Black

* Closing_Marubozu_White

* Closing_Point_Reversal_Formation_Top

* Closing_Point_Reversal_Formation_Bottom

* Concealing_Baby_Swallow

* Counter_Attack_Line_Bearish

* Counter_Attack_Line_Bullish

* Create_Date

* Dark_Cloud_Cover

* Deliberation_Bear

* Deliberation_Bull

* Doji

* Downside_Gap_Three_Methods

* Dragonfly_Doji_Bear

* Dragonfly_Doji_Bull

* Engulfing_Line_Bearish

* Engulfing_Line_Bullish

* Evening_Doji_Star

* Evening_Star

* Falling_Three_Method

* Gap_Formation_Up

* Gap_Formation_Down

* Gravestone_Doji

* Hammer

* Hanging_Man

* Harami_Bullish

* Harami_Cross_Bullish

* Harami_Cross_Bearish

* Harami_Bearish

* High_Wave_White

* High_Wave_Black

* Homing_Pigeon

* Identical_Three_Crows

* In_Neck_Line

* Inside_Bar_Formation

* Inverted_Hammer

* Island_Reversal_Formation_Top

* Island_Reversal_Formation_Bottom

* Key_Reversal_Formation_Top

* Key_Reversal_Formation_Bottom

* Kicking_Down

* Kicking_Up

* Ladder_Top

* Ladder_Bottom

* Long_Legged_Doji

* Marubozu_Black_Downtrend

* Marubozu_White_Downtrend

* Marubozu_Black_Uptrend

* Marubozu_White_Uptrend

* Matching_Low

* Mat_Hold_Pattern

* Meeting_Lines_Bear

* Meeting_Lines_Bull

* Morning_Doji_Star

* Morning_Star

* OnNeck

* Outside_Bar_Formation

* Piercing_Line

* Rickshaw_Man

* Rising_Three_Method

* Separating_Line_Bearish

* Separating_Line_Bullish

* Shooting_Star

* Short_Line_Black

* Short_Line_White

* Side_By_Side_White_Gapping_Down

* Side_By_Side_White_Gapping_Up

* Spinning_Top_Black

* Spinning_Top_White

* Stalled_Pattern

* Stick_Sandwich_Bullish

* Stick_Sandwich_Bearish

* Takuri

* Tasuki_Downside_Gap

* Tasuki_Upside_Gap

* Three_Black_Crows

* Three_Gaps_Down

* Three_Gaps_Up

* Three_Inside_Down

* Three_Inside_Up

* Three_Line_Strike_Bear

* Three_Line_Strike_Bull

* Three_White_Soldiers

* Three_Outside_Down

* Three_Outside_Up

* Three_Stars_In_The_South

* Thrusting_Line

* Tri_Star_Bear

* Tri_Star_Bull

* Tweezers_Bottom

* Tweezers_Top

* Turn_Formation_Upturn

* Turn_Formation_Downturn

* Two_Crows

* Unique_Three_River

* Upside_Gap_Two_Crows

* Upside_Gap_Three_Methods

* Western_Gap_Down

* Western_Gap_Up
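Because each pattern in the list above appears in the files as its own 0/1 column, multi-pattern screens reduce to column arithmetic. A minimal sketch using a handful of the columns above with made-up illustrative rows:

```python
import pandas as pd

# Illustrative one-day slice with a few of the pattern columns listed above;
# real files carry all of them plus the OHLCV and metadata fields.
df = pd.DataFrame([
    {"Ticker": "AAA", "Hammer": 1, "Morning_Star": 1, "Three_White_Soldiers": 0},
    {"Ticker": "BBB", "Hammer": 0, "Morning_Star": 0, "Three_White_Soldiers": 0},
    {"Ticker": "CCC", "Hammer": 1, "Morning_Star": 0, "Three_White_Soldiers": 1},
])

bullish = ["Hammer", "Morning_Star", "Three_White_Soldiers"]
# Count how many bullish patterns fired per ticker, then screen for >= 2.
df["bull_hits"] = df[bullish].sum(axis=1)
screened = df[df["bull_hits"] >= 2]
print(screened["Ticker"].tolist())
```

The same column-sum approach scales to any subset of the pattern columns.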

 

---

## Additional Information

* [Our Candlestick Patterns in Detail](https://www.olaptrader.com/our-candles-described/)

---

## Pricing Information

Purchasing this data set entitles subscribers to all future revisions.

---

## Subscription Verification Information

This product requires subscription verification.

---

## Need Help?

* If you have questions or comments, contact us [HERE](mailto:mail@olaptrader.com)

---

## About OLAPTrader

* [www.olaptrader.com](https://www.olaptrader.com)

 

OLAPTrader was created to compile and analyze technical indicator data using multi-dimensional analysis, also called online analytical processing (OLAP). Using our own homegrown application, which leverages third-party financial functions and open-source libraries, we compile over 200 technical indicators (250 with our variations), 15 Fibonacci retracements/projections, 17 advance/decline indicators, and 100 Japanese candlestick patterns for over 20 US and international exchanges. We believe that when you understand the power of mutually complementary indicators, and use dimensional analysis and historical data to find the best combinations, you give yourself a tremendous edge over other investors.
