Hi 👋,
I'm building a service as an AWS Lambda function packaged as a Docker image.
Even though I've set `rec_model_dir`, `det_model_dir`, and `cls_model_dir` to local paths where the models are already present, every run still downloads ch_PP-OCRv3_det_infer.tar.
Code:
```python
import sys

from paddleocr import PaddleOCR, draw_ocr

def handler(event, context):
    # Point PaddleOCR at the model directories already bundled in the image.
    ocr = PaddleOCR(det=False, det_model_dir='/tmp/whl/det/en/en_PP-OCRv3_det_infer/',
                    rec_model_dir='/tmp/whl/rec/en/en_PP-OCRv3_det_infer/',
                    cls_model_dir='/tmp/whl/cls/en/en_PP-OCRv3_det_infer/',
                    use_angle_cls=False, use_gpu=False, lang='en')
    return 'Hello from AWS Lambda using Python' + sys.version + '!'
```
Log:
```
download https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_infer.tar to /home/sbx_user1051/.paddleocr/whl/det/ch/ch_PP-OCRv3_det_infer/ch_PP-OCRv3_det_infer.tar
```
How can I avoid that?
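For reference, here is a minimal sketch of a sanity check I can run in the container to see what is actually inside those directories at runtime (paths copied from the handler above; my assumption is that PaddleOCR falls back to downloading when it does not find the inference files it expects in the configured directory):

```python
import os

# Hypothetical diagnostic: list the contents of each configured model directory.
# Paths are the ones used in the handler above; which files PaddleOCR expects
# to find there is an assumption on my part.
model_dirs = {
    'det': '/tmp/whl/det/en/en_PP-OCRv3_det_infer/',
    'rec': '/tmp/whl/rec/en/en_PP-OCRv3_det_infer/',
    'cls': '/tmp/whl/cls/en/en_PP-OCRv3_det_infer/',
}

for name, path in model_dirs.items():
    if os.path.isdir(path):
        print(name, sorted(os.listdir(path)))
    else:
        print(name, 'directory missing')
```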
Thank you