I HIGHLY RECOMMEND this video. I just followed the video, followed the steps and it's working perfectly well. Big THANK YOU sir.
Thank you so much for your kind words. I really appreciate it! Of course, it's my pleasure! :)
Great video! Clearly showed every step and easy to understand. Thank You :)
Thank you so much! I'm glad that you found it useful!
You did it, Buddy!!!! Thanks for hard work and clear presentation of the ENTIRE process.
Thank you so much! Of course, my pleasure! I'm glad that you enjoyed the video!
Thank you for the video. It took my whole day to solve it and finally with your video I did it
My pleasure! I'm very glad to hear that you managed to come right with what you were trying to solve! All the best to you!
new subscriber because YOU ARE AWESOME!!! keep making these amazing tutorials!
Thank you so much for the support!
Awesome, thank you very much for this detailed AWS video!
My pleasure!
Can you make a video for that package?
Django APScheduler
Thank you for the suggestion
You are awesome!!! Thanks for this wonderful tutorial, it worked straight away!
Thank you so much for your kind words and feedback! Of course, my pleasure! I'm glad that it worked!
Lifesaver, thank you so much!
Of course, my pleasure! I'm glad that it helped you!
Very good! Thank you! I had issues deploying to Heroku so I had to change the ACL configs. Thanks
Thank you! My pleasure! Right, I see. Yes, Heroku has a few specifications on their side. I'm glad that you managed in the end, though, with it. No worries!
I've done literally everything and even double-checked my settings with GPT. I can't for the life of me get this to work and I've been at it all day. It keeps defaulting to static; it's like it's not even trying to push to S3, even though all the credentials are set up, storages is in the middleware, whitenoise is commented out, and the static files storage is boto. I have no idea what to do. It should at least throw an error if something is wrong with permissions, or there should be a way to see if a request was even made to S3. Oh. My. Gosh. It's been 24 hours, possibly 6-8 hours at just this alone.
Hi there. I'd suggest taking a look at the source code provided and using that as a baseline. I've re-tested the process and everything works as it should. Please double-check that your AWS_STORAGE_BUCKET_NAME, AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_S3_REGION_NAME are set correctly, that you haven't changed their names, and that there are no additional spaces. Look for any potential conflicts or overriding of your S3 settings, especially STATIC_URL and STATIC_ROOT. Ensure whitenoise is completely removed and not somehow reactivated elsewhere in the config. Additionally, please ensure that you have no network restrictions or firewalls preventing communication between Django and Amazon S3.
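As a rough sketch of what those settings typically look like together (the bucket name, region, and URL layout below are placeholder assumptions, not your values):

```python
import os

# Sketch of the S3 settings the checklist above refers to.
# The variable names must match these exactly; values are read
# from the environment with placeholder defaults.
AWS_STORAGE_BUCKET_NAME = os.environ.get("AWS_STORAGE_BUCKET_NAME", "my-bucket")
AWS_ACCESS_KEY_ID = os.environ.get("AWS_ACCESS_KEY_ID", "")
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY", "")
AWS_S3_REGION_NAME = os.environ.get("AWS_S3_REGION_NAME", "us-east-1")

# Route static files through django-storages' boto3 backend
# instead of whitenoise or the local filesystem.
STATICFILES_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
STATIC_URL = f"https://{AWS_STORAGE_BUCKET_NAME}.s3.amazonaws.com/static/"
```

If collectstatic still writes locally, it usually means another STATICFILES_STORAGE assignment later in settings.py (or an imported settings module) is silently overriding this one.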
Good luck!
Thank you for this video. It’s been a great help! My images, CSS, and JavaScript are all working, but the icons are not showing on my website. Could you help me with this?
My pleasure! I'm glad that it helped. With icons, I'd suggest that you use Font Awesome icons, where you simply add in the CDN link and then the icon markup directly. That'll help solve icon issues.
The last minutes are a treasure
Thank you so much!
@@CloudWithDjango keep up brother allah bless your work
Thank you so much!!
Thank you so much for this tutorials. I have a question please
Do EC2 instances come with storage?
My pleasure!
Yes, EC2 instances come with storage. It is grouped into two types, namely:
Instance Store volumes (ephemeral storage) and Amazon Elastic Block Store (EBS).
@@CloudWithDjango Thank you
My pleasure!
Everything works in the local environment but doesn't work in production (deployed on EC2)
Hi. This process works in both environments. The only thing that you need to set in production environments is your environment variables. As soon as that's configured it will work anywhere.
I'd suggest double checking the process.
How do I connect this to my Django website, which is deployed using Vercel by following your other video? My CSS and JS don't work there, and I am getting an error when pushing the modified settings.py file to my GitHub, even though I have set the .env file too.
Hi. I'd suggest double-checking your application before deployment to make sure that everything is working as it should.
Hi, thanks for the tutorial, but should I follow the same steps if I have an app already deployed on EC2?
Hi,
No worries!
It would be easier to follow these steps before deployment. It could be done with an app that's already deployed, although it might be tricky to handle from that approach.
@@CloudWithDjango thanks, a few changes but it worked
No worries! Glad that you managed in the end!
Does GetObject handle POST requests too? I mean, things like form submissions or adding users on our website.
In short, do both read and write permissions work through GetObject on an S3 bucket?
Thank you for the great tutorial!
How can I use a role instead of an IAM user?
It’s generally considered bad practice to have users and IAM access keys that we use to access AWS resources, because those keys can easily become compromised. Instead, the general practice is to create an IAM role that the application uses, which has permissions to access the things it needs (S3 bucket, etc…). The application running in Fargate can then “assume” that IAM role to be granted access to what it needs.
Until now I used an IAM user and access key too, but I want to switch to a role. I thought it would be straightforward, but it wasn't. Do you happen to know more about this approach?
No worries if you don't, you did a great job for whoever needs to use this IAM user approach!
Of course, my pleasure! Thank you for your message.
To use a role instead of an IAM user, you can create an IAM role with the necessary permissions, assign it to your compute service (like EC2 or ECS), and the service will automatically assume the role to access resources like S3 without needing IAM user keys. For example, if you want an EC2 instance to access an S3 bucket, you would create a role with a policy granting s3:GetObject or s3:PutObject permissions. After the role is created, it is assigned to the EC2 instance or another AWS service that needs the permissions. When the instance or service needs to perform actions, it automatically assumes the role and temporarily gets the permissions specified in the role's policy. You will, of course, need to dig deeper based on your own preferences.
Regarding the IAM role vs IAM user approach, and from a best practice standpoint, you’re right. IAM roles are more secure, and they are indeed the preferred option over using IAM user access keys.
This video assumes you're using Amazon S3 as a standalone service, focusing only on S3 integration without involving other AWS compute services. Hence we don't use IAM roles here, since a role has to be assumed by something, which doesn't fit a sole, native S3 integration.
But yes, if your application is fully within AWS, using IAM roles is definitely the better approach.
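As a sketch of what the role-based setup looks like in settings.py (the bucket name and region below are assumptions): the key difference is that no access keys appear anywhere, because boto3 resolves the role's temporary credentials on its own.

```python
# settings.py sketch for the IAM-role approach (assumes the app runs
# on an AWS service with a role attached, e.g. EC2 or ECS).
# No AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY here: boto3 walks its
# credential chain (env vars -> shared config -> instance/container
# metadata) and picks up the role's temporary keys automatically.
AWS_STORAGE_BUCKET_NAME = "my-bucket"   # hypothetical bucket name
AWS_S3_REGION_NAME = "us-east-1"        # hypothetical region
STATICFILES_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
```

The role's policy still needs to grant the relevant S3 actions (s3:GetObject, s3:PutObject, etc.) on the bucket.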
@CloudWithDjango thank you, just noticed you answered me now! 🐾
Great video, thanks !
Thank you so much! Glad that it helped!
Simply Wowwwwwwwwwwwwwww
Thank you so much!
OMG this was amazingggggg
Thank you so much! :)
Hello! Just wanted to ask which service I should choose in AWS: Elastic Beanstalk or S3? (Please recommend the free one.)
Hi,
Unfortunately, AWS services aren't free beyond the free-tier limits; you will generally pay a fee.
Elastic Beanstalk is for deployment and S3 is for handling objects (file management). The services are different.
I had configured all my code according to yours but am still facing an error: module not found: storages. I have confirmed that the module is correctly installed in my env. Can you help me in this situation?
Hi. Kindly refer to the source code at the GitHub link (in the description). This code has been tried and tested; all you need to do is install the necessary packages and follow the steps, and you will be fine. I'd suggest using that as a baseline for your first application. Good luck!
Is making a policy a necessary step? Because I watched some other tutorials and they did not create an S3 bucket policy.
Hi,
Yes, it is necessary so that we can ensure that our bucket is public and define the API actions that we can perform on our objects. Sure, you can upload objects without the policy, but you still need to be able to access them. This is why we use the policy.
Okay, thank you so much! Excellent video btw :')
My pleasure! I'm glad that you enjoyed the video, thank you! :)
I can't say thank you enough
Of course, my pleasure! Glad that you liked the video! :)
The media files are now public. How do I prevent that? I have some media files that only staff users should be able to access!
Hi. Yes, that's right, and it's how we go about it in this video, under the assumption that the files are things like profile pictures that everyone should see. If you only want staff users to access them, you will need to add additional settings to your JSON policy on S3.
It helps, but when I push to GitHub it does not allow me to because of the keys 😢 What can I do?
Hi,
This won't work because the access keys are sensitive data, and it's dangerous to upload a project with AWS access keys exposed. They should be within environment variables.
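As a minimal sketch of the environment-variable approach (the variable names here are common conventions, not requirements):

```python
import os

# Read the sensitive values from the environment at startup instead of
# hard-coding them in settings.py; nothing secret ends up in Git.
AWS_ACCESS_KEY_ID = os.environ.get("AWS_ACCESS_KEY_ID", "")
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY", "")

# Warn early if the keys are missing so a misconfigured deploy is
# caught immediately rather than silently falling back to local static.
if not AWS_ACCESS_KEY_ID or not AWS_SECRET_ACCESS_KEY:
    print("Warning: AWS credentials are not set in the environment.")
```

A .env file plus a loader such as python-dotenv (or the host's own config-vars UI) is the usual way to populate these variables; just make sure .env itself is listed in .gitignore.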
@@CloudWithDjango I might need a tutorial for that as well ❤️🩹
Sure thing! I have a few tutorials on the channel that focus on setting up environment variables
@@CloudWithDjango thank you, I'll look it up
No problem! Good luck!
Thanks so much!
My pleasure!!
You need to go through all of this just to render CSS and JS files in your deployed Django app??
Hi,
Also media files, but yes, it is a long process with Amazon S3.
I am using Django 5.0, django-storages 1.14.2 and boto3 1.34.73. I have been trying all weekend to get this app to work with Backblaze B2 or DigitalOcean Spaces. Both are said to be S3-compatible. I am running into the same issue with both. My /static/ directory does not get picked up and loaded into my storage bucket upon calling collectstatic. Everything pertaining to my admin, and all of the other apps in the project that have static files, gets swept up and loaded into my storage bucket, and the admin works fine... but no static files or media files for my front end. I have no idea if it is django-storages or Django or something else. My logs look fine except for the fact that they do not show my /static/ directory being swept up for my remote bucket. Everything works fine in the dev environment when serving from the file system.
Hi, to troubleshoot the issue with your Django application not loading static files into your remote storage bucket:
First, ensure that your Django settings are correctly configured to use the remote storage backend for static files. Check your STATIC_URL, STATIC_ROOT, and STATICFILES_STORAGE settings in your settings.py file.
Verify that your application has the necessary permissions to write to the remote storage backend, including both read and write permissions for static files.
Check the bucket settings in Backblaze B2 or DigitalOcean Spaces to confirm that they allow access and write permissions for your Django application.
Test connectivity by manually uploading a static file to your remote storage backend using the provider's tools. This can help confirm that your application can connect to the storage backend successfully.
Review the Django logs for any error messages or warnings related to static file handling. This can provide clues to the underlying issue.
By following these steps, you should be able to identify and resolve the issue with loading static files into your remote storage bucket.
As a final resort, I'd suggest downloading and utilising the final project source code that was used for this video. It is available in the description (for clarity).
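Two things worth checking in this specific setup, sketched below with assumed paths: on Django 4.2+ the storage backends are configured through the STORAGES dict (the older STATICFILES_STORAGE name is deprecated there), and collectstatic only sweeps project-level directories that are listed in STATICFILES_DIRS; per-app static folders like the admin's are found automatically, which could explain why only those arrive in the bucket.

```python
from pathlib import Path

BASE_DIR = Path(__file__).resolve().parent  # one level up in a real project

# Django 4.2+/5.0 sketch: configure both media ("default") and static
# files through the STORAGES dict instead of STATICFILES_STORAGE.
STORAGES = {
    "default": {"BACKEND": "storages.backends.s3boto3.S3Boto3Storage"},
    "staticfiles": {"BACKEND": "storages.backends.s3boto3.S3Boto3Storage"},
}

# Project-level static directory; without this entry, collectstatic
# only picks up per-app static/ folders (admin, installed apps, ...).
STATICFILES_DIRS = [BASE_DIR / "static"]

# For S3-compatible providers, point boto3 at their endpoint
# (hypothetical URL shown).
AWS_S3_ENDPOINT_URL = "https://s3.us-west-004.backblazeb2.com"
```

If /static/ is collected locally in dev but skipped for the remote bucket, the missing STATICFILES_DIRS entry is the first thing I'd rule out.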
Good luck!
@marklong2060 Were you able to solve it?
@@gofooddy5472 No, I wasn't able to make it work. I resorted to storing the images on the hard disk for now. For this particular app it works okay because it's low-traffic, etc. Will need to revisit this issue later for sure.
How do I connect this to my Django website, which is deployed using PythonAnywhere by following your other video? My CSS and JS don't work there.
Hi,
As soon as you've completed the Amazon S3 setup for static and media files, it should work fine on PythonAnywhere. You need to do S3 first and then PythonAnywhere.
How much time will the collectstatic command take over AWS?
Hi. I cannot say. It all depends on the size of your application with the AWS CLI.
@@CloudWithDjango On checking, I was using CKEditor for the text body, so it was consuming more time and files. After removing CKEditor it's working smoothly.
Thank you for the contribution! Keep sharing.
No worries! Glad that it helped!
Amazing video
Thank you!
I have created an AWS account; the website asked me for my credit card details, and I chose the free account tier out of the 3 options.
I need to know: is this process free, or do I need to pay?
Hi,
Please kindly read the linked article guide for more insight.
Unknown Error: "An unexpected error occurred." API response: "The specified configuration does not exist." Helpful.....
Hi,
I included the GitHub source code in the description, maybe check this for further reference?
@@CloudWithDjango Thank you, will have a look.
My pleasure! Good luck!
Guys, I have a problem. It worked properly, but I'm using Tailwind with Django, and when I activate the bucket my CSS file also goes to S3 and is not reloaded on the page. I already deleted my Tailwind CSS from the bucket but it didn't solve it; I can only solve it by removing Amazon S3 from the settings. Does anyone know how to solve it?
Hi,
Please ensure your static files are correctly configured to be served from Amazon S3. Check STATIC_URL and STATICFILES_STORAGE settings. Verify CSS files generated by Tailwind are collected and stored in your static files directory, then uploaded to S3. If CSS files aren't updating, check caching settings and invalidate cache when deploying changes. If issues persist, consult documentation or forums for specific guidance.
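If it is specifically the browser or a CDN holding on to the old Tailwind build, one option is to shorten the cache lifetime that django-storages stamps on uploaded objects (the value below is an assumption; tune it to your deploy cadence):

```python
# django-storages applies these object parameters to every file it
# uploads to S3; a short CacheControl means a rebuilt Tailwind CSS
# file is re-fetched soon after each deploy instead of served stale.
AWS_S3_OBJECT_PARAMETERS = {
    "CacheControl": "max-age=300",  # 5 minutes; hypothetical value
}
```

An alternative is hashed filenames (e.g. Django's ManifestFilesMixin combined with the S3 storage backend), which sidesteps caching entirely because each build produces a new file name.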
Good luck!
How can I solve this error: botocore.exceptions.ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden
Hi,
Please ensure that you have no syntax errors in the variable declarations and that the access keys you are using are valid. These are the main causes of this error.
Can this process also be used for Django Rest Framework API ?
Hi,
The best way to know would be to test it and see
@@CloudWithDjango
Thanks for the reply. I have another question.
For a Django Rest Framework API and Next.js combination, is this process the same or different?
My pleasure!
The process of combining Django Rest Framework (DRF) API with Next.js is similar to integrating any frontend framework with a backend API. You would use Next.js to create the frontend, and DRF to develop the backend API. The communication between them typically involves making HTTP requests from Next.js to the DRF endpoints. However, the specific implementation details may vary based on your project requirements and architecture choices. Ensure proper CORS (Cross-Origin Resource Sharing) settings, handle authentication, and manage state as needed in your Next.js application.
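The cross-origin piece usually comes down to a few settings; here is a sketch assuming the django-cors-headers package and a Next.js dev server on port 3000 (both assumptions):

```python
# Sketch of CORS settings for a Next.js frontend calling a DRF API
# (assumes django-cors-headers is installed; the origin is hypothetical).
INSTALLED_APPS = [
    # ... your other apps ...
    "rest_framework",
    "corsheaders",
]

MIDDLEWARE = [
    "corsheaders.middleware.CorsMiddleware",  # should sit near the top
    "django.middleware.common.CommonMiddleware",
    # ... the rest of the default middleware ...
]

# Only the listed origins may call the API from a browser.
CORS_ALLOWED_ORIGINS = ["http://localhost:3000"]  # Next.js dev server
```

In production you would swap the localhost origin for your deployed frontend's domain.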
@@CloudWithDjangook thank you
My pleasure!
So after all this, people can use my web app online? How do I configure the link so people can go online and use my app? Thank you
Hi,
I think there is a misunderstanding. In web development, static files need to be public, as do images, if they're publicly available to all.
This is not a deployment video, so you'd need to watch a tutorial on deployment so people can go online and use your app.
Thank you for your response. It's my first time hosting a website or web app; now my bucket got created in [us-1] and that's not letting me use collectstatic.
No problem. I'd recommend double-checking your steps and ensuring that everything is exactly as I have done; it will work.
@@CloudWithDjango I’ll try again , thank you !
Good luck! Of course, my pleasure!
no CDN setup?
Hi,
That is optional.
Can you do a video on customizing Django allauth templates?
Thank you for your suggestion
It's paid!!
Hi,
Yes, Amazon S3 is paid, depending on your usage.
Hi, why am I getting this error when doing 'manage.py collectstatic'?
..... raise ImproperlyConfigured("Could not load Boto3's S3 bindings. %s" % e)
django.core.exceptions.ImproperlyConfigured: Could not load Boto3's S3 bindings. No module named 'six.moves' ......
I do have the six package installed.
Hi,
I'd suggest double-checking your configuration settings for syntax issues. This particular error usually points to an outdated or broken dependency rather than your settings, so it may also be worth upgrading django-storages, boto3, and six inside the same environment that Django runs from.
Great guidance, but I have watched the video 10 times 🥲🥲
and I am still getting this error:
An error occurred (AccessControlListNotSupported) when calling the PutObject operation: The bucket does not allow ACLs.
This is the policy included in my Bucket
{
"Version": "2012-10-17",
"Id": "Policy1700*******99",
"Statement": [
{
"Sid": "Stmt170*******49",
"Effect": "Allow",
"Principal": "*",
"Action": "s3:GetObject",
"Resource": "arn:aws:s3:::ump-ai-tutor-v1/*"
}
]
}
Oww, now it's fixed!
I changed AWS_DEFAULT_ACL = "public-read" to AWS_DEFAULT_ACL = None
Hi,
Thank you for your feedback! :)
This error occurs because the bucket has ACLs disabled (the default for new buckets), so any upload that tries to set an ACL is rejected; setting AWS_DEFAULT_ACL = None, as you found, stops an ACL from being sent with each upload.
If you also want the bucket policy itself to allow writes, you can add the PutObject API action to your JSON policy as such (note that with "Principal": "*" this allows public writes, so only do this if that is really what you want):
{
"Version": "2012-10-17",
"Id": "Policy1700*******99",
"Statement": [
{
"Sid": "Stmt170*******49",
"Effect": "Allow",
"Principal": "*",
"Action": [
"s3:GetObject",
"s3:PutObject" // Add this line to allow the PutObject action
],
"Resource": "arn:aws:s3:::ump-ai-tutor-v1/*"
}
]
}