Have you seen backup disks that have just an Ethernet port, like the WD (Western Digital) My Cloud with 8 TB of capacity? If you don't know the commands to mount them or how to access their dashboard, things can get a little harder.
LED colors on the front of the disk:
- blue LED blinking: getting ready to serve data (if the disk is almost full this may take 2 to 3 hours)
- blue LED solid (no blinking): LAN is connected and the device is ready.
- red LED blinking: there is something wrong with the device, maybe a bad cluster or something else!
- red LED solid (no blinking): the device is ready, but the LAN cable is not connected.
These LED colors and behaviours are almost always the same on other devices too, so memorize them :)
To get the IP address of the WD My Cloud, ping its local hostname:
ping wdmycloud.local
To see the GUI dashboard, open a browser and head over to:
- http://wdmycloud.local
To mount the device, first check that the blue LED on the disk is solid (not blinking), then create the destination directory if it does not exist:
mkdir -p /mnt/my_backup_folder
Now mount the disk with the command below (you can see share names like MyFolder under the dashboard's Shares menu):
sudo mount -t nfs MyCloudIPAddress:/nfs/MyFolder /mnt/my_backup_folder
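If you want the share mounted automatically at boot, an /etc/fstab entry along these lines should work. This is a sketch: the IP address, share name, and mount point below are hypothetical stand-ins for the values from the mount command above, and _netdev tells the system to wait for the network before mounting:

```
# /etc/fstab — example entry; IP, share name and mount point are placeholders
192.168.1.50:/nfs/MyFolder  /mnt/my_backup_folder  nfs  defaults,_netdev  0  0
```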
NOTE: the dashboard's default username is admin, and it has no password by default.
NOTE: if you want to reset the password, see: https://support.wdc.com/knowledgebase/answer.aspx?ID=13986
#wd #my_cloud #backup_disk #western_digital #wdmycloud
How to get the file MIME type in Python?
pip install python-magic
Now get the file MIME type like below:
>>> import magic
>>> magic.from_file("testdata/test.pdf", mime=True)
'application/pdf'
If you do not provide mime=True:
>>> magic.from_file("testdata/test.pdf")
'PDF document, version 1.2'
If you are on OS X you may get the error below:
ImportError: failed to find libmagic. Check your installation
To solve the problem, install libmagic using brew:
brew install libmagic
Or using MacPorts:
port install file
Read more here:
- https://github.com/ahupp/python-magic
#python #python_magic #magic #mimetype #mime #libmagic
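If a rough guess based only on the file name is enough, Python's standard library can do it without the libmagic dependency — a minimal sketch (the path is a made-up example):

```python
import mimetypes

# Guesses from the file extension alone; the file does not need to exist,
# unlike libmagic, which inspects the actual file contents.
mime, encoding = mimetypes.guess_type("testdata/test.pdf")
print(mime)  # 'application/pdf'
```

The trade-off: mimetypes cannot detect a mislabeled file, while python-magic can.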
How to upload a file into Amazon object storage using boto3?
pip install boto3
Now you just need the region, endpoint, access key and secret key, which you are given after purchase:
import boto3

session = boto3.session.Session()
client = session.client('s3',
                        region_name=YOUR_REGION,
                        endpoint_url=YOUR_HOST,
                        aws_access_key_id=YOUR_ACCESS_KEY,
                        aws_secret_access_key=YOUR_SECRET_KEY)

client.upload_file(file_path,                # Path to local file
                   obj_config['spacename'],  # Name of Space
                   'YOUR_FILE_NAME.txt',     # Name for remote file
                   ExtraArgs={"Metadata": {'user-id': USER_ID}})  # metadata
NOTE: in the name of the remote file you can pass /, like my/file/here.txt. It will then create the directories (virtually) in the remote object storage.
#python #object_storage #boto3 #file_upload
How to ignore extra fields during schema validation in Mongoengine?
Some records currently have extra fields that are not included in my model schema (by mistake, but I want to handle these cases). When I try to query the DB and transform the records into the schema, I get the following error:
FieldDoesNotExist
The field 'X' does not exist on the document 'Y'
To ignore this error when extra fields are present while fetching data, set strict to False in your meta dictionary:
class User(Document):
    email = StringField(required=True, unique=True)
    password = StringField()
    meta = {'strict': False}
#mongodb #mongo #python #mongoengine #strict #FieldDoesNotExist
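Conceptually, strict: False just tolerates unknown keys on the stored documents. A plain-Python sketch of that idea, not Mongoengine's actual implementation (the field names are hypothetical):

```python
def filter_to_schema(record, allowed_fields):
    """Drop keys that are not part of the schema, the way a
    non-strict document load tolerates unknown fields."""
    return {k: v for k, v in record.items() if k in allowed_fields}

raw = {'email': 'a@b.com', 'password': 'secret', 'legacy_flag': True}
clean = filter_to_schema(raw, {'email', 'password'})
print(clean)  # the unknown 'legacy_flag' key is dropped
```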
In MongoDB you can remove duplicate documents based on a specific field:
db.yourCollection.aggregate([
    { "$group": {
        "_id": { "yourDuplicateKey": "$yourDuplicateKey" },
        "dups": { "$push": "$_id" },
        "count": { "$sum": 1 }
    }},
    { "$match": { "count": { "$gt": 1 } }}
]).forEach(function(doc) {
    doc.dups.shift();
    db.yourCollection.remove({ "_id": { "$in": doc.dups } });
});
It uses aggregation to group documents by the given key, pushing each _id into the dups field and keeping a tally in the count field. $match then keeps only the groups with a count greater than 1. Finally, it loops over each group and removes every duplicate except the first one (shift drops the first _id from the removal list, which is what preserves it).
#mongodb #mongo #duplicates #duplication
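The same keep-the-first, drop-the-rest logic can be sketched in plain Python (the documents and key below are made up for illustration):

```python
def duplicate_ids(docs, key):
    """Return the _ids of all but the first document per key value,
    mirroring the shift()-then-remove step of the aggregation."""
    seen = set()
    removable = []
    for doc in docs:
        value = doc[key]
        if value in seen:
            removable.append(doc['_id'])
        else:
            seen.add(value)  # first occurrence survives
    return removable

docs = [
    {'_id': 1, 'email': 'a@b.com'},
    {'_id': 2, 'email': 'a@b.com'},  # duplicate of _id 1
    {'_id': 3, 'email': 'c@d.com'},
]
print(duplicate_ids(docs, 'email'))  # [2]
```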
Tech C**P
https://eli.thegreenplace.net/2010/06/25/aes-encryption-of-files-in-python-with-pycrypto
#python #encryption #file_encryption #pycrypto #AES #IV #CBC
How to sort data based on a column in Pandas?
You can use sort_values in order to sort the data in a dataframe:
df.sort_values(['credit'], ascending=False, inplace=True)
The sample above assumes you have a dataframe called df and sorts it by user credit. The sort order is descending (ascending=False), and it sorts in place (you don't have to copy the result into a new dataframe).
#python #pandas #dataframe #sort #inplace
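A minimal runnable version of the same call, on a made-up dataframe with a credit column:

```python
import pandas as pd

# Hypothetical data standing in for the df assumed above
df = pd.DataFrame({'user': ['a', 'b', 'c'], 'credit': [10, 30, 20]})

# Sort by credit, highest first, modifying df itself
df.sort_values(['credit'], ascending=False, inplace=True)
print(df['credit'].tolist())  # [30, 20, 10]
```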
Forwarded from Alireza Hos.
Tonight is the birth night of the world's sayyids / tonight is the wedding night of Zahra and Heydar
Tonight the Throne hums with Ali's joy / tonight is the blessed night of Ali's wedding
Bint Asad, whose hand the angels have kissed / give a gift, for Fatima has become your bride
https://techcrunch.com/2018/08/12/a-private-tesla-backed-by-saudi-arabia-might-not-be-as-far-fetched-as-you-think/
#news #elun #tesla
How to prepend a string to all file names in a directory in a bash script?
for f in *.py; do mv "$f" "old-$f"; done
The one-liner above loops over all files in the current directory with the .py extension and prepends old- to their names. So, for example, a file named main.py will be renamed to old-main.py.
#python #bash #script #prepend #move #rename #for
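A self-contained way to try the one-liner safely in a scratch directory (the file names are arbitrary examples):

```shell
# Work in a fresh temporary directory so nothing real gets renamed
demo_dir=$(mktemp -d)
cd "$demo_dir"
touch main.py utils.py

# Prepend "old-" to every .py file; quoting "$f" keeps names with spaces safe
for f in *.py; do mv "$f" "old-$f"; done

ls  # old-main.py  old-utils.py
```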
How do you know if a remote web server is Linux-based or Windows-based? (Ignore this if there is a pfSense/HAProxy in front.)
ping the remote host:
ping iranhost.com
The output is similar to:
PING iranhost.com (174.142.214.47): 56 data bytes
64 bytes from 174.142.214.47: icmp_seq=0 ttl=105 time=174.379 ms
64 bytes from 174.142.214.47: icmp_seq=1 ttl=105 time=180.315 ms
64 bytes from 174.142.214.47: icmp_seq=2 ttl=105 time=177.937 ms
Look at the ttl part of the output. Windows hosts typically start with a default TTL of 128, while Linux hosts start at 64, and each router hop decrements it by one. So a TTL between 65 and 128, like the 105 here, suggests a Windows-based machine, while 64 or below suggests a Linux-based machine.
#os #linux #windows #ping #ttl #iranhost
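The TTL heuristic can be sketched as a small function. The cutoffs assume the common default TTLs (64 for Linux/Unix, 128 for Windows, 255 for much network gear); this is a rule of thumb, not a guarantee, since TTLs can be tuned:

```python
def guess_os_from_ttl(ttl):
    """Guess the sender's OS from a ping TTL by matching the
    nearest common default TTL above the observed value."""
    if ttl <= 64:
        return 'linux/unix'   # default 64, minus some hops
    if ttl <= 128:
        return 'windows'      # default 128, minus some hops
    return 'network device'   # default 255, minus some hops

print(guess_os_from_ttl(105))  # the ttl=105 from the example -> 'windows'
```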
Forwarded from Alireza Hos.
🔺 Remarkable support by the people of Turkey for their country's currency
🔹 The dollar has fallen 13% in value against the lira since the start of the grassroots campaign