Commit 408fa782 authored by Kedar A.

Initial pull from replica

parent d2ab8475
@@ -183,7 +183,33 @@
> git pull origin mongokit # Or Simply
* The following steps are required for the mail-client app and the Replication features:
9. Create a file local_settings.py in /gstudio/gnowsys-ndf/gnowsys_ndf/ and add the following variables to it:
# SMTP settings for sending mail (using the Gmail SMTP server)
EMAIL_USE_TLS = True
EMAIL_HOST = ''
EMAIL_PORT = 587
EMAIL_HOST_USER = ''
EMAIL_HOST_PASSWORD = ''
# Email ID and password of the account that will be used for sending/receiving SYNCDATA
SYNCDATA_FETCHING_EMAIL_ID = ''
SYNCDATA_FETCHING_EMAIL_ID_PASSWORD = ''
SYNCDATA_FETCHING_IMAP_SERVER_ADDRESS = ''
SYNCDATA_SENDING_EMAIL_ID = ''
# Interval (in seconds) at which the send_syncdata and fetch_syncdata scripts are run
SYNCDATA_DURATION = 60
* Edit SCSS/Sass stylesheet(s)
(a) Installing Ruby and Compass (see the command sketch below):
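The rest of this subsection is collapsed in the diff. For reference, Compass is distributed as a Ruby gem, so on a Debian/Ubuntu host (assumed here, matching the apt-get usage elsewhere in these notes) the usual commands are:
> sudo apt-get install ruby-full
> sudo gem install compass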
** REQUIRED PACKAGES
Download the required packages using the following commands:
1. pip install django-mailbox
2. pip install celery
3. sudo apt-get install rabbitmq-server (requires root privileges)
Download CKEditor with the 'confighelper' plugin added to it, and place it in the 'ndf/static/' folder.
** IMPORTANT
1. The following folders will be created:
   at parent level 'gnowsys-ndf/gnowsys_ndf/ndf':
   1. MailClient
   2. MailClient/mailbox_data
   3. MailClient/mailbox_data/[username]/[mailbox_name]/[cur/new/temp]
   4. MailClient/mailbox_data/[username]/Archived_Mails
   5. MailClient/sent_syncdata_files
   6. MailClient/syncdata
   at parent level 'gnowsys-ndf':
   1. mailbox_attachments
2. If the same file is changed on more than one system, the last change made (by anyone) is the one that remains.
3. The following variables have to be filled in 'local_settings.py':
#+BEGIN_EXAMPLE
# SMTP settings for sending mail (using the Gmail SMTP server)
EMAIL_USE_TLS = True
EMAIL_HOST = ''
EMAIL_PORT = 587
EMAIL_HOST_USER = ''
EMAIL_HOST_PASSWORD = ''
# Email ID and password of the account that will be used for receiving SYNCDATA mails
SYNCDATA_FETCHING_EMAIL_ID = ''
SYNCDATA_FETCHING_EMAIL_ID_PASSWORD = ''
SYNCDATA_FETCHING_IMAP_SERVER_ADDRESS = ''
# Mailing-list ID (i.e. syncdata mails will be sent to this ID)
SYNCDATA_SENDING_EMAIL_ID = (
    '',
)
# The From field of outgoing syncdata mails is set by this variable
SYNCDATA_FROM_EMAIL_ID = ''
# sample: 'Gstudio <t.metastudio@gmail.com>'
# Interval (in seconds) at which the send_syncdata and fetch_syncdata scripts are run
SYNCDATA_DURATION = 60
# Signing key: fill in the pub of the key with which syncdata mails will be signed
SYNCDATA_KEY_PUB = ''
#+END_EXAMPLE
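As an illustration (every value below is made up, reusing the sample account and key ID that appear later in this section), a filled-in local_settings.py might look like:
#+BEGIN_EXAMPLE
EMAIL_USE_TLS = True
EMAIL_HOST = 'smtp.gmail.com'
EMAIL_PORT = 587
EMAIL_HOST_USER = 'gstudio.ss@gmail.com'
EMAIL_HOST_PASSWORD = 'app-password'
SYNCDATA_FETCHING_EMAIL_ID = 'gstudio.ss@gmail.com'
SYNCDATA_FETCHING_EMAIL_ID_PASSWORD = 'app-password'
SYNCDATA_FETCHING_IMAP_SERVER_ADDRESS = 'imap.gmail.com'
SYNCDATA_SENDING_EMAIL_ID = (
    'syncdata-list@example.org',
)
SYNCDATA_FROM_EMAIL_ID = 'Gstudio <gstudio.ss@gmail.com>'
SYNCDATA_DURATION = 60
SYNCDATA_KEY_PUB = '963E2E69'
#+END_EXAMPLE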
** INSTRUCTIONS
1. Fill in the details in gstudio/key_script/gen_key_inp.txt
2. Run $ python manage.py generate_keys
3. Put the pub of this key in local_settings.py (SYNCDATA_KEY_PUB)
4. Fill in the other variables of local_settings.py for sending and receiving syncdata mails
5. Give your public key to the other systems and collect the public keys of all other systems involved in the program, as shown in the example below.
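For example, the public key can be exported to a shareable 'key.pub' file with standard GnuPG usage (the key ID shown is the sample one used later in this section):
#+BEGIN_EXAMPLE
$ gpg --armor --export 963E2E69 > key.pub
#+END_EXAMPLE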
*** INSTRUCTIONS TO IMPORT KEYS
1. With the public key file 'key.pub' in the current working directory, run:
#+BEGIN_EXAMPLE
$ gpg --import key.pub
#+END_EXAMPLE
2. Run:
#+BEGIN_EXAMPLE
$ gpg --list-keys
#+END_EXAMPLE
All keys will be listed, including the one just imported. Copy the 'pub' of the newly imported key. As an example, assume the command produces the following output, and that this is the key we just imported:
#+BEGIN_EXAMPLE
/home/sample_user/.gnupg/pubring.gpg
------------------------------------
pub   2048R/963E2E69 2015-06-29
uid                  SchoolServer (ss) <gstudio.ss@gmail.com>
#+END_EXAMPLE
What we have to copy here is '963E2E69'.
3. The key has been imported, but that is not enough: we must sign it with our own private key to mark it as trusted. To do so, run:
#+BEGIN_EXAMPLE
$ gpg --edit-key 963E2E69
#+END_EXAMPLE
Note: the key pub copied above is pasted here.
4. A 'gpg> ' prompt will appear. At this prompt, type:
#+BEGIN_EXAMPLE
gpg> sign
#+END_EXAMPLE
5. Enter 'y' when prompted to confirm.
6. Type 'q', hit enter, then 'y', and hit enter again to save the changes and exit.
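Putting steps 3 to 6 together, a session looks roughly like this (GnuPG prompts abridged; the key ID is the sample one from above):
#+BEGIN_EXAMPLE
$ gpg --edit-key 963E2E69
gpg> sign
Really sign? (y/N) y
gpg> q
Save changes? (y/N) y
#+END_EXAMPLE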
*** INSTRUCTIONS FOR MANUALLY SENDING/RECEIVING SYNCDATA
1. A change on metastudio is defined as adding or editing a page, file, or forum.
2. On the system where the new changes were made, run:
#+BEGIN_EXAMPLE
$ python manage.py send_syncdata
#+END_EXAMPLE
3. On the system where the new changes are to be fetched, run:
#+BEGIN_EXAMPLE
$ python manage.py fetch_syncdata
#+END_EXAMPLE
*** INSTRUCTIONS FOR AUTOMATICALLY SENDING/RECEIVING SYNCDATA (uses Celery and RabbitMQ)
1. A change on metastudio is defined as adding or editing a page, file, or forum.
2. Check that the SYNCDATA_DURATION variable in local_settings.py is assigned.
3. On all involved systems, run the following from the 'gstudio/gnowsys-ndf' directory:
#+BEGIN_EXAMPLE
$ celery -A gnowsys_ndf worker -B -l info
#+END_EXAMPLE
This starts the Celery beat scheduler, which runs the 'send_syncdata' and 'fetch_syncdata' scripts every SYNCDATA_DURATION seconds.
** ISSUES
1. The mailbox password cannot contain the '#' character.
2. Unable to capture node-data for the 'Themes' and 'Tasks' groups properly because of their associated Triples.
** TO BE TESTED
1. Handling of attachments received in syncdata mails that are unsigned, or signed by keys not in our web of trust.
** FUTURE SCOPE
1. Displaying sent mails, deleted mails, and drafts, and assigning labels to mails.
2. Deleting the attachment if decryption fails.
# gnowsys_ndf/celery.py (assumed filename): the Celery application
from __future__ import absolute_import

from celery import Celery

#os.environ.setdefault('DJANGO_MAILBOX', 'Local')
app = Celery('gnowsys_ndf',
             include=['gnowsys_ndf.tasks'])
app.config_from_object('gnowsys_ndf.celeryconfig')

if __name__ == '__main__':
    app.start()


# gnowsys_ndf/celeryconfig.py: beat schedule for the syncdata scripts
from datetime import timedelta

from gnowsys_ndf.local_settings import SYNCDATA_DURATION

BROKER_URL = 'amqp://'
CELERYBEAT_SCHEDULE = {
    'do-every-fixed-seconds': {
        'task': 'gnowsys_ndf.tasks.run_syncdata_script',
        'schedule': timedelta(seconds=SYNCDATA_DURATION),
        #'args': (16, 16)
    },
}
CELERY_TIMEZONE = 'UTC'
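
# Note: the beat schedule above refers to 'gnowsys_ndf.tasks.run_syncdata_script',
# and gnowsys_ndf/tasks.py is not part of this diff. A minimal sketch of what
# such a task could look like, assuming it simply invokes the two management
# commands (hypothetical implementation, for illustration only):
#
#   from __future__ import absolute_import
#   from django.core.management import call_command
#   from gnowsys_ndf.celery import app
#
#   @app.task
#   def run_syncdata_script():
#       # send local changes, then fetch changes posted by other systems
#       call_command('send_syncdata')
#       call_command('fetch_syncdata')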
@@ -306,5 +306,3 @@ relation_types = [{'_type':'RelationType','name':'group_has_course_event'},
]
@@ -29,6 +29,4 @@ class UserChangeform(PasswordChangeForm):
    new_password1 = PasswordField(label="New password")


class UserResetform(SetPasswordForm):
    new_password1 = PasswordField(label="New password")
@@ -54,7 +54,43 @@ def get_gsystems(GSystemTypeList):
    var = {"name": {"$in": GSystemTypeList}}
    Systemtypes = node_collection.find(var)
    Systemtypelist = [i._id for i in Systemtypes]
    '''
    for i in Systemtypes:
        if i.name != "Event":
            Systemtypelist.append(i._id)
            output = GSystem_node(i)
            if output:
                lisst.extend(output)
    #Systemtypes.reload()
    #exceptionlist = ['Event']
    Systemtypelist = [i._id for i in Systemtypes if i.name not in exceptionlist]
    Systemtypelist.extend(lisst)
    '''
    '''
    cmd = "mongoexport --db studio-dev --collection Nodes -q '" + '%s' % var + "' --out Schema/GSystemType.json"
    subprocess.Popen(cmd, stderr=subprocess.STDOUT, shell=True)
    '''
    return Systemtypelist

def GSystem_node(node):
    Gsystem_type_defination = []
    for i, v in node.items():
        print i, v
        if type(v) == list:
            if v is not None:
                for j in v:
                    if type(j) == list:
                        print "one more listing zone", i, j
                    if isinstance(j, type(node_collection.collection.RelationType())) or isinstance(j, type(node_collection.collection.AttributeType())):
                        Gsystem_type_defination.append(j._id)
                    if isinstance(j, ObjectId):
                        if ObjectId(j) not in Gsystem_type_defination:
                            Gsystem_type_defination.append(ObjectId(j))
    return Gsystem_type_defination
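
# Usage sketch (hypothetical, for illustration): collect the AttributeType /
# RelationType ids referenced by the list fields of a GSystemType node:
#   gst = node_collection.one({"_type": "GSystemType", "name": "Page"})
#   definition_ids = GSystem_node(gst)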
def get_relationtypes(RelationTypeList):
@@ -121,46 +157,50 @@ def make_rcs_dir(final_list):
    for i in final_list:
        # get the rcs file path and copy it to the current dir:
        if type(i) != int:
            a = node_collection.find_one({"_id": ObjectId(i)})
            if a:
                rel_path = hr.get_file_path(a)
                path = rel_path + ",v"
                #cp = "cp -u " + path + " " + " --parents " + rcs_path + "/"
                #subprocess.Popen(cp, stderr=subprocess.STDOUT, shell=True)
                filter_list = ['GSystemType', 'RelationType', 'AttributeType', 'MetaType']
                if a._type in filter_list:
                    try:
                        file_node = get_version_document(a, rel_path, '1.1')
                    except:
                        # if the rcs file doesn't exist on the system, create it
                        a.save()
                        file_node = get_version_document(a, rel_path, '1.1')
                    if a._type == 'GSystemType':
                        GSystemtypenodes.append(file_node)
                    elif a._type == 'RelationType':
                        Relationtypenodes.append(file_node)
                    elif a._type == 'AttributeType':
                        Attributetypenodes.append(file_node)
                    elif a._type == 'MetaType':
                        metatype.append(file_node)
                else:
                    if a._type == "Group":
                        with open('file_log.txt', 'a') as outfile:
                            outfile.write(a.name)
                            outfile.write(" ")
                            outfile.write(a._type)
                            outfile.write("\n")
                        try:
                            file_node = get_version_document(a, rel_path, '1.1')
                        except:
                            a.save()
                            file_node = get_version_document(a, rel_path, '1.1')
                    else:
                        try:
                            file_node = get_version_document(a, rel_path)
                        except:
                            a.save()
                            file_node = get_version_document(a, rel_path)
                    factorydatanode.append(file_node)
    # start making the catalog
    make_catalog(GSystemtypenodes, 'GSystemType')
@@ -24,7 +24,7 @@ class Command(BaseCommand):
        #db = get_database()
        #files = db['fs.files']
        #chunks = db['fs.chunks']
        # follow the recursive strategy: go to the depth of the node, take individual dumps, and then finally dump the main groups
        Group_node = node_collection.find_one({"_type": "Group", "name": unicode(Group_name)})
        if Group_node:
            # all the nodes having this Group id
@@ -68,7 +68,6 @@ class Command(BaseCommand):
        value_node = triple_collection.find_one({"_type": "GRelation", "subject": i._id, "relation_type.$id": rel_node._id})
        if value_node:
            print value_node
            Grelations_ids.append(value_node._id)
        if type(attr.object_value) not in [datetime.datetime, unicode, int]:
            try:
@@ -84,7 +83,7 @@ class Command(BaseCommand):
        f_list = [ObjectId(i) for i in fs_file_ids]
        at_files = '{"_id": {"$in": %s}}' % Gattribute_value
        at_files = at_files.replace("'", '"')
        cmd = "mongoexport --db " + db_name + " --collection Nodes -q '" + '%s' % at_files + "' --out backup/at_files.json"
        commandlist.append(cmd)
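        # For reference, a dump produced by the mongoexport command above would
        # be restored with the matching mongoimport call (same db name assumed):
        #   mongoimport --db <db_name> --collection Nodes --file backup/at_files.json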
from django.core.management.base import BaseCommand, CommandError
import os
from subprocess import call
from django.core.mail import EmailMessage


class Command(BaseCommand):
    help = 'Calls the script that generates a public/private key pair. Please ensure you have edited ../gstudio/key_script/gen_key_inp.txt before running this command'

    def handle(self, *args, **kwargs):
        path0 = os.path.dirname(__file__).split('/gstudio')[0] + '/gstudio/key_script/'
        script_name = 'gen_key_script.sh'
        script_input_file_name = 'gen_key_inp.txt'
        path1 = path0 + script_name
        path2 = path0 + script_input_file_name
        # pass path2 as an argument so gen_key_script.sh can read gen_key_inp.txt,
        # because the command is run from /gnowsys-ndf as:
        # $ python manage.py generate_keys
        command = 'bash' + ' ' + path1 + ' ' + path2
        print "command", command
        call([command], shell=True)
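
# Note: gen_key_inp.txt itself is not part of this diff. If gen_key_script.sh
# drives GnuPG's unattended key generation (gpg --batch --gen-key), the input
# file would follow the standard batch format, roughly (values illustrative):
#   Key-Type: RSA
#   Key-Length: 2048
#   Name-Real: SchoolServer
#   Name-Email: gstudio.ss@gmail.com
#   Expire-Date: 0
#   %commit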
from django.core.management.base import BaseCommand, CommandError


class Command(BaseCommand):
    help = 'Fetches mails from a mailing list and, if subscribed, also updates the same on the server'

    def handle(self, *args, **kwargs):
        print 'It does work'
import os
import datetime

from django.core.management.base import BaseCommand, CommandError
from pymongo import Connection


class Command(BaseCommand):

    def handle(self, *args, **options):
        # rotate the logs whenever the maintenance script is executed
        #os.path.getsize('/var/log/mongodb/mongod.log')
        conn = Connection()
        database = conn['admin']
        database.command({"logRotate": 1})
        print "Mongo Log Rotated."
        # Rotate the Registry and received-file logs.
        # Create the folders under /data (written with docker in consideration):
        if not os.path.exists('/data/Registry'):
            os.mkdir('/data/Registry')
        if not os.path.exists('/data/receiveddata'):
            os.mkdir('/data/receiveddata')
        Registry_Log_path = '/data/Registry'
        Recived_data_path = '/data/receiveddata'
        root_path = os.path.abspath(os.path.dirname(os.pardir))
        tym_scan = os.path.join(root_path, 'Last_Scan.txt')
        if os.path.exists(tym_scan):
            file_output = open(tym_scan)
            last_scan = file_output.readline()
            index = last_scan.find(":")
            str1 = last_scan[index + 1:]
            time = str1.strip("\t\n\r ")
        # get a timestamp to append to the rotated file names
        timestamp = datetime.datetime.now()
        timestamp = timestamp.isoformat()
        registry_path = os.path.join(root_path, 'Registry.txt')
        filename = "Registry.txt." + str(timestamp)
        receivedfile = os.path.join(root_path, 'receivedfile')
        recv_filename = "receivedfile." + str(timestamp)
        filename = os.path.join(Registry_Log_path, filename)
        recv_filename = os.path.join(Recived_data_path, recv_filename)
        #time = time - datetime.timedelta(minutes=5)
        # Rotate the Registry file: move entries older than the last scan time
        cmd = os.popen("cat %s|awk '$0 < \"%s\"' > %s" % (str(registry_path), str(time), str(filename)))
        cmd.close()
        d = os.popen("sed -i '1,/$TIME/d' %s" % str(registry_path))
        d.close()
        print "Registry.txt Rotated."
        # Rotate the received-file information
        cmd = os.popen("cat %s|awk '$0 < \"%s\"' > %s" % (str(receivedfile), str(time), str(recv_filename)))
        cmd.close()
        d = os.popen("sed -i '1,/$TIME/d' %s" % str(receivedfile))
        d.close()
        print "Receivedfile Rotated."
@@ -7,63 +7,73 @@ import json
class Command(BaseCommand):
    help = "setup the initial database"

    def handle(self, *args, **options):
        file_names = ['metatype', 'AttributeType', 'RelationType', 'GSystemType', 'factorydata']
        for i in file_names:
            json_documents_list = []
            PROJECT_ROOT = os.path.abspath(os.path.dirname(os.pardir))
            refcatpath = os.path.join(PROJECT_ROOT + '/GRef.cat/' + i)
            path_val = os.path.exists(refcatpath)
            print path_val, refcatpath
            if path_val:
                for dir_, _, files in os.walk(refcatpath):
                    for filename in files:
                        filepath = os.path.join(dir_, filename)
                        with open(filepath, 'r') as json_file:
                            json_data = json_file.read()
                            json_documents_list.append(json.loads(json_data))
                parsed_json_document = []
                for j, json_data in enumerate(json_documents_list):
                    converted_data = node_collection.from_json(json.dumps(json_data))
                    # check whether the node already exists
                    existing_node = node_collection.one({"_id": ObjectId(converted_data["_id"])})
                    if existing_node is None:
                        if i == 'GSystemType':
                            node = node_collection.collection.GSystemType()
                        elif i == 'factorydata':
                            if json_data['_type'] == 'Group':
                                node = node_collection.collection.Group()
                            else:
                                node = node_collection.collection.GSystem()
                        elif i == 'metatype':
                            node = node_collection.collection.MetaType()
                        elif i == 'RelationType':
                            node = node_collection.collection.RelationType()
                        elif i == 'AttributeType':
                            node = node_collection.collection.AttributeType()
                        if converted_data['name']:
                            converted_data['modified_by'] = 1
                            converted_data['created_by'] = 1
                            for key, values in converted_data.items():
                                if values and type(values) == list:
                                    oid_ObjectId_list = []
                                    oid_list_str = values.__str__()
                                    if key == 'attribute_type_set' or key == 'relation_type_set' or key == 'meta_type_set':
                                        value = []
                                        if values is not None:
                                            value = process_list(values)
                                        node[key] = value
                                    elif '$oid' in oid_list_str:
                                        for oid_dict in values:
                                            oid_ObjectId = ObjectId(oid_dict['$oid'])
                                            oid_ObjectId_list.append(oid_ObjectId)
                                        node[key] = oid_ObjectId_list
                                else:
                                    if key != "_type":
                                        node[key] = values
                            node.save()
                    else:
                        print "Node already present !!!"