Importing a file into JupyterLab from S3

Matt123

I have a file I want to import into a SageMaker Jupyter notebook Python 3 instance. The exact code would be 'import lstm'. I can store the file in S3 (which would probably be ideal) or locally, whichever works. I have been searching the internet for a while and have been unable to find a solution. I am actually just trying to run/understand this code from Suraj Raval's YouTube channel: https://github.com/llSourcell/Bitcoin_Trading_Bot. The 'import lstm' line fails when I run the notebook, and I am trying to figure out how to make this work.

I have tried 'from s3://... import lstm', which failed. I have also tried some boto3 methods and wasn't able to get them to work.

import time
import threading
import lstm, etl, json  # this line fails
import numpy as np
import pandas as pd
import h5py
import matplotlib.pyplot as plt
configs = json.loads(open('configs.json').read())
tstart = time.time()

I would just like to be able to import the lstm file and all the others into a Jupyter notebook instance.

raj

I think you should clone the GitHub repo in the SageMaker instance rather than importing the files from S3. I was able to reproduce the Bitcoin Trading Bot notebook on SageMaker by cloning it. You can follow the steps below.

Cloning Github Repo to SageMaker Notebook

  1. Open JupyterLab from the AWS SageMaker console.
  2. From the JupyterLab Launcher, open the Terminal.
  3. Change directory to SageMaker:

     cd ~/SageMaker

  4. Clone the Bitcoin Trading Bot git repo:

     git clone https://github.com/llSourcell/Bitcoin_Trading_Bot.git
     cd Bitcoin_Trading_Bot

  5. Open the notebook Bitcoin LSTM Prediction.ipynb and select the TensorFlow kernel to run it.
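If 'import lstm' still fails after cloning, the notebook kernel may not be running from the repo directory; Python only resolves a bare 'import lstm' from directories on sys.path. A minimal sketch (the repo path below is an assumption based on the clone steps above):

```python
import os
import sys

# Assumed location of the cloned repo on a SageMaker notebook instance
repo_dir = os.path.expanduser('~/SageMaker/Bitcoin_Trading_Bot')

# Prepend the repo directory so `import lstm` can find lstm.py
if repo_dir not in sys.path:
    sys.path.insert(0, repo_dir)

# import lstm, etl  # resolvable once the repo is cloned there
```

Alternatively, simply opening the notebook from inside the cloned repo folder gives the kernel that folder as its working directory, which is already on sys.path.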


Adding files from local machine to SageMaker Notebook

To add files from your local machine to a SageMaker Notebook instance, you can use the file upload functionality in JupyterLab.

Adding files from S3 to SageMaker Notebook

To add files from S3 to a SageMaker Notebook instance, use the AWS CLI or a Python SDK such as boto3 to upload/download the files.

For example, to download the lstm.py file from S3 to SageMaker using the AWS CLI:

aws s3 cp s3://mybucket/bot/src/lstm.py .

Using the boto3 API:

import boto3
s3 = boto3.resource('s3')
s3.meta.client.download_file('mybucket', 'bot/src/lstm.py', './lstm.py')
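Once the file sits in the current directory, a plain 'import lstm' works. If you want to load it from an arbitrary path without touching sys.path, the standard library's importlib can load a module directly from a file. A sketch, assuming the file was already downloaded as shown above ('import_module_from_file' is a helper defined here, not part of boto3):

```python
import importlib.util


def import_module_from_file(name, path):
    """Load a Python module from a file path, e.g. a file
    just downloaded from S3, without modifying sys.path."""
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

# Example usage after the download above:
# lstm = import_module_from_file('lstm', './lstm.py')
```

Note the cloning approach is still simpler here, because lstm.py itself imports sibling files (etl, etc.), which a one-file download would miss.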

