Search Results

Search found 33257 results on 1331 pages for 'django database'.

Page 37 of 1331

  • PostgreSQL - one database for everyone, or one database per customer

    - by user337876
    I'm working on a web-based business application where each customer will need to have their own data (think basecamphq.com type model). For scalability and ease of upgrades, I'd prefer to have a single database where each customer gets a filtered version of the data. The problem is how to guarantee that they stay sandboxed to their own data. Trying to enforce it in code seems like a disaster waiting to happen. I know Oracle has a way to append a WHERE clause to every query based on a login id, but does PostgreSQL have anything similar? If not, is there a different design pattern I could use (like creating a view of each table for each customer that filters)? Worst-case scenario, what is the performance/memory overhead of having 1,000 100 MB databases versus a single 1 TB database? I will need to provide backup/restore functionality on a per-customer basis, which is dead-simple if each customer has their own database but quite a bit trickier if they share one database with other customers.
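
    For reference, PostgreSQL has since gained a built-in answer to the Oracle-style per-login filter: row-level security (CREATE POLICY, PostgreSQL 9.5+) attaches a predicate to every query against a table. If the filtering has to live in the application instead, the usual way to keep it from becoming scattered ad-hoc WHERE clauses is a single choke point such as a Django manager. A minimal, hypothetical sketch of that approach (model and field names are illustrative; Django versions before 1.6 spell the hook get_query_set):

        import threading
        from django.db import models

        _state = threading.local()

        def set_current_customer(customer_id):
            # Call this from request middleware once the user is authenticated.
            _state.customer_id = customer_id

        class TenantManager(models.Manager):
            def get_queryset(self):
                # Every ORM query through this manager is scoped to one customer.
                # An unset customer matches no rows, which fails safe.
                current = getattr(_state, 'customer_id', None)
                return super(TenantManager, self).get_queryset().filter(
                    customer_id=current)

        class Invoice(models.Model):
            # Hypothetical tenant-scoped table; every such table carries the key.
            customer_id = models.IntegerField(db_index=True)
            total = models.DecimalField(max_digits=10, decimal_places=2)

            objects = TenantManager()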

    Read the article

  • SQL Server replication - how to sync tables from internal database to read-only website database

    - by frankadelic
    I have an internal SQL Server 2005 database "ADMIN_DATA" that is used by admin users. We would like to sync three of the database tables in ADMIN_DATA out to another SQL Server 2005 database "WEB_DATA", which is used by a public web app. WEB_DATA is read-only - only SELECT statements are allowed - while ADMIN_DATA is updated all the time. What is the best solution? How can this be accomplished with minimal custom coding and/or changes to database tables? Notes: ADMIN_DATA and WEB_DATA are on different physical machines and on different subnets. The syncing operation doesn't need to be instantaneous.

    Read the article

  • VS2010 Ultimate Database Project + SQL Server 2008: Updating Project from Database

    - by josecortesp
    Hello everyone, this is a quick question: I have a Database Project in a web app solution, together with the real database. I want to update the database using SQL Server Management Studio and then update the corresponding VS project. Can this be done? I know that you can update the scripts in the project, but I'm not the SQL kind of guy; I'd rather make these changes using Management Studio... Thanks in advance.

    Read the article

  • Server database -> client database update based on version

    - by user296191
    Hi, what is the recommended method of collecting items in a server database, versioning the database and then deploying only the version differences to a client? Should it be a field in the table (i.e. Version: 3.3.9876) against each record? Should it be a server-based DateTime in each record? And what's the best way to deploy just the changes to a client with an older version of the database? Is it a dump to a file with a bulk import of some description? Open to comments and suggestions. The database can be anything (Firebird, MySQL, SQL Server, SQLite)... Any info greatly appreciated.
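
    A common pattern is a monotonically increasing row_version (or last-modified timestamp) column that the server bumps on every change, so a client only pulls rows newer than the version it last synced. A minimal sketch using Python's built-in sqlite3 module and a hypothetical items table:

        import sqlite3

        def fetch_changes(conn, last_synced_version):
            # Return every row the server changed after the client's last sync.
            # Assumes a hypothetical `items` table with an integer `row_version`
            # column that the server increments on every insert or update.
            cur = conn.execute(
                "SELECT id, payload, row_version FROM items WHERE row_version > ?",
                (last_synced_version,))
            return cur.fetchall()

        # Example: the client remembers 42 as the last version it synced.
        conn = sqlite3.connect("server_copy.db")
        changed_rows = fetch_changes(conn, 42)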

    Read the article

  • Transferring a flat file database to a MySQL database

    - by Jon
    I have a flat file database (yeah, gross, I know - the worst part is that it's 1.4GB), and I'm in the process of moving it to a MySQL database. The problem is that I'm not sure how to go about doing this - I've checked through every related question on here but none relate to what I want to do, nor to how my database is currently set up. My current flat-file database is set up so that a normal MySQL row is its own file, and a MySQL table would be the directory. So, for example, if you have a user named Jon, there would be a file for the user in a directory named /members/. Within that file would be various pieces of information for the user, including the user's id, rank etc., all separated by tabs and all on separate lines (userid\t4). So here's an example user file:

        userid	4
        notes	staff notes: bla bla staff2 notes: bla bla bla
        username	Example

    So how can I convert the above into their own rows and fields in MySQL? And if possible, could I do thousands of these files at once? Thanks.
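
    A rough migration sketch: walk the /members/ directory, split each line on the tab, and insert one row per file. The PyMySQL driver, the directory path and the column names below are all assumptions for illustration; any DB-API 2.0 MySQL driver works the same way.

        import os
        import pymysql  # assumed driver; any DB-API 2.0 MySQL driver works similarly

        MEMBERS_DIR = "/path/to/members"  # hypothetical location of the flat-file "table"

        conn = pymysql.connect(host="localhost", user="root",
                               password="secret", db="mysite")
        with conn.cursor() as cur:
            cur.execute("""CREATE TABLE IF NOT EXISTS members (
                               userid   INT PRIMARY KEY,
                               username VARCHAR(255),
                               notes    TEXT)""")
            for name in os.listdir(MEMBERS_DIR):
                fields = {}
                with open(os.path.join(MEMBERS_DIR, name)) as fh:
                    for line in fh:
                        # Each line is "key<TAB>value"; keep everything after the tab.
                        key, _, value = line.rstrip("\n").partition("\t")
                        fields[key] = value
                cur.execute(
                    "INSERT INTO members (userid, username, notes) VALUES (%s, %s, %s)",
                    (fields.get("userid"), fields.get("username"), fields.get("notes")))
        conn.commit()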

    Read the article

  • Django model field value preprocessing before returning

    - by Satoru.Logic
    Hi, all. I have a Note model class like this:

        class Note(models.Model):
            author = models.ForeignKey(User, related_name='notes')
            content = NoteContentField(max_length=256)

    NoteContentField is a custom subclass of CharField that overrides the to_python method in order to do some twitter-text-conversion processing:

        class NoteContentField(models.CharField):
            __metaclass__ = models.SubfieldBase

            def to_python(self, value):
                value = super(NoteContentField, self).to_python(value)
                from ..utils import linkify
                return mark_safe(linkify(value))

    However, this doesn't work. When I save a Note object like this:

        note = Note(author=request.user, content=form.cleaned_data['content'])

    the converted value is saved into the database, which is not what I want to see. Would you please tell me what's wrong with this? Thanks in advance.
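
    The likely culprit is that with SubfieldBase, to_python runs not only when values are loaded from the database but also whenever the attribute is assigned, so the linkified text is what ends up being saved. One hedged workaround is to store the raw text in a plain CharField and convert only on read, for example via a model property (linkify here is the question's own utility; the import path is hypothetical):

        from django.contrib.auth.models import User
        from django.db import models
        from django.utils.safestring import mark_safe

        class Note(models.Model):
            author = models.ForeignKey(User, related_name='notes')
            content = models.CharField(max_length=256)  # raw text only

            @property
            def content_html(self):
                # Convert on read, so the database always keeps the original text.
                from myapp.utils import linkify  # hypothetical module path
                return mark_safe(linkify(self.content))

    Templates would then render {{ note.content_html }} while forms and the database keep working with the unmodified content field.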

    Read the article

  • Django model data consistency

    - by Mark
    When creating a form, you can define a bunch of clean_xyz methods to make sure the data gets forced into the correct format. Is there any way to do this at the model level? Perhaps I can override the field setters somehow? I want it so that if I write something like my_address.postal_code = 'a1b2c3', it will automatically be formatted as 'A1B 2C3', and perhaps throw an exception if it can't be converted. That way I know I'll never have any malformed data in the database.
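
    Model-level validation is one way to get this guarantee: override clean() to normalize (and reject) the value, then call full_clean() from save() so nothing malformed ever reaches the database. A minimal sketch using Django 1.2+ model validation; the Address model and Canadian-style postal code pattern are illustrative:

        import re

        from django.core.exceptions import ValidationError
        from django.db import models

        class Address(models.Model):  # hypothetical model
            postal_code = models.CharField(max_length=7)

            def clean(self):
                # Normalize 'a1b2c3' -> 'A1B 2C3', or refuse to save at all.
                code = re.sub(r'\s+', '', self.postal_code).upper()
                if not re.match(r'^[A-Z]\d[A-Z]\d[A-Z]\d$', code):
                    raise ValidationError('Not a valid postal code.')
                self.postal_code = code[:3] + ' ' + code[3:]

            def save(self, *args, **kwargs):
                self.full_clean()  # enforce the formatting on every save
                super(Address, self).save(*args, **kwargs)

    A property setter or a pre_save signal works too if the conversion should happen at assignment time rather than at save time.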

    Read the article

  • Sun Oracle Database Machine at Romania's Banca Transilvania

    - by Fekete Zoltán
    Oracle press release: Banca Transilvania, first institution in Romania to use Sun Oracle Database Machine (English version). The success story and customer case study are available as a PDF. Oracle announced the Database Machine V2 in September 2009, and Banca Transilvania of Romania is the first bank in the world to run the Database Machine V2 in production - read the press release. Banca Transilvania has 1.5 million customers. "This system, product of Oracle and Sun, is the fastest server in the world for data storage, online transactions processing and data warehousing applications." Robert C. Rekkers, Banca Transilvania CEO, said: "Business information is accessed 30 times faster using the new system, leading to quicker decisions and a better data base segmentation" - in other words, with the Database Machine they can answer business questions 30 times faster than with the previous system. Leontin Toderici, Banca Transilvania COO, added: "The acquisition price was excellent, as the costs were below those of an ordinary system", that is, the price was excellent and the cost lower than that of a conventional system. Sorin Mindrutescu, head of Oracle Romania, is proud that a Romanian company is among the first users of this innovative product: "Oracle Exadata V2 is the result of over 30 years of experience in hardware and software development of two leader companies. I am glad that a top Romanian company is amongst the first in the world to use this innovative product." The Exadata product family and the Database Machine are excellent platforms for running the databases of OLTP systems, data warehouses and consolidation projects. A single package contains the software and the "smart" hardware - the processing and storage components - connected by extremely fast InfiniBand links. Banca Transilvania tested the Database Machine at Oracle's Reading (UK) centre and saw performance ten times, in places seventy-two times, faster than its previous system - a 10-72x performance gain, noted Tudor Iliescu, Trend Import - Export CEO. The central Oracle press release: Customers Select Oracle® Exadata for Extreme Performance of Data Warehouse and OLTP Applications

    Read the article

  • Oracle Database 11g R2 now supported under SAP as well

    - by Lajos Sárecz
    Since Easter, Oracle Database 11g R2 has also been usable under SAP. It is well known that SAP only certifies Release 2 versions, so this is genuinely good news for SAP users, who can now use the following 11g R2 features in SAP environments:
    • Advanced Compression option (for tables, RMAN backups, expdp, the Data Guard network)
    • Real Application Testing
    • Oracle Database 11g Release 2 Database Vault
    • Oracle Database 11g Release 2 RAC
    • Advanced Encryption for tablespaces, RMAN backups, expdp and the Data Guard network
    • Direct NFS
    • Deferred Segments
    • Online Patching
    For example, the SAP database, or the backups made from it, can be compressed; experience so far shows compression ratios of 2-4x depending on the database. The risk of database upgrades and of any other change to the database infrastructure can be reduced significantly with Real Application Testing. Administrator roles can be separated using Database Vault. The new features of Real Application Clusters 11g R2 also become available. With Transparent Data Encryption, tablespaces and backups can be encrypted in a way that is transparent to the application, yet the data cannot be read by accessing the media directly. The Direct NFS client is now supported, which improves NFS access speed considerably. With Deferred Segments, table segments are allocated only when data is actually inserted into the table; this is useful because applications typically create every table at installation time even though many tables never receive data, so both installation time and database size can be reduced. Online Patching allows patches to be installed without downtime. These are attractive capabilities, so it is worth planning the database upgrade of SAP systems for the near future, especially since Premier Support for the 10g release expires this summer. For the upgrade I strongly recommend Real Application Testing, which lets you test the upgrade in a test environment under production workload. The Sun Oracle Database Machine and Exadata are unfortunately not yet supported under SAP, because the ASM certification has not yet been completed; according to current news this is expected in early 2011.

    Read the article

  • New P6 Reporting Database R2

    - by mark.kromer
    Along with our recently announced GA release of P6 Analytics R1, you may have noticed that when you purchase P6 Analytics, we provide a restricted-use license for P6 Reporting Database R2. This represents an updated version of the previous P6 Reporting Database 6.2 and can be purchased individually on a per-CPU basis. Typically, you will want just the reporting database if you would like the P6 data warehouse components, such as the ETL, data models, ODS and star schemas, in order to report on that data with a reporting tool other than Oracle's. The P6 Analytics solution will only work on Oracle BI (OBI). But I pasted below some examples of a simplistic matrix report that I built from the P6 Reporting Database using Microsoft SQL Server Reporting Services. This is the Report Builder tool, which is very similar to other report-building tools on the market today, such as Crystal Reports or Oracle BI Publisher. This is an example of what you can do (in a very simple format) by using the P6 Reporting Database without P6 Analytics. Here is a quick run-down of some of the key new features in P6 Reporting Database R2 that were added as enhancements to the 6.2 version:
    • 4 new star schemas (improved projects star, project history, resource utilization and resource allocation)
    • Improved ETL performance and reliability
    • P6 security is inherited at the star schema level
    • Custom P6 project, activity & resource codes are now available as customizable dimensions in the star schemas
    • Time-phased data down to the day is now available from the star schemas
    • An updated Operational Data Store (ODS) for operational reporting that includes the WBS hierarchy
    • The ODS now includes daily spreads for activity and resource assignments

    Read the article

  • SQL SERVER – Installing AdventureWorks Sample Database – SQL in Sixty Seconds #010 – Video

    - by pinaldave
    SQL Server has so many enhancements and features that quite often I feel like playing with various features and trying out new things. I often come across a situation where I want to try something new but I do not have sample data to experiment with. Also, just like any sane developer, I do not try any of my new experiments on a production server. Additionally, when it is about a new version of SQL Server, there are cases when there is no relevant sample data available even on the development server. In this kind of scenario a sample database can be very handy. Additionally, many SQL books, online blogs and articles contain scripts written using the AdventureWorks database. I often receive requests asking where people can get the sample database, as well as how to restore it. In this sixty-second video we discuss exactly that. You can get the various resources used in this video from http://bit.ly/adw2012.

    More on Errors:
    SQL SERVER – Install Samples Database Adventure Works for SQL Server 2012
    SQL SERVER – 2012 – All Download Links in Single Page – SQL Server 2012
    SQLAuthority News – SQL Server 2012 – Microsoft Learning Training and Certification
    SQLAuthority News – Download Microsoft SQL Server 2012 RTM Now

    I encourage you to submit your ideas for SQL in Sixty Seconds. We will try to accommodate as many as we can. If we like your idea we promise to share educational material with you.

    Reference: Pinal Dave (http://blog.sqlauthority.com)

    Filed under: Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL in Sixty Seconds, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology, Video

    Read the article

  • Large invoice database structure and rendering

    - by user132624
    Our client has an MS SQL database that has 1 million customer invoice records in it. Using the database, our client wants its customers to be able to log into a frontend web site and then view, modify and download their company's invoices. Given the size of the database and the large number of customers who may log into the web site at any time, we are concerned about database engine performance and web page invoice rendering performance. The 1 million invoice database covers just 90 days' sales, so we will remove invoices over 90 days old from the database. Most of the invoices have multiple line items. We can easily convert our invoices into various data formats; for example, it is easy for us to convert between SQL and XML with a related schema and XSLT. Any data conversion would be done on another server so as not to burden the web interface server. We have tentatively decided to run the web site on a .NET Framework IIS web server using MS SQL on MS Azure. How would you suggest we structure our database for best performance? For example, should we put all the invoices of all customers located within the same 5-digit or 6-digit zip codes into the same table? Or could we set up a separate home directory for each customer on IIS and place each customer's invoices in that customer's home directory in XML format? And secondly, what would be the best method to render customer invoices on a web page and allow customers to modify them, again for best performance? The ADO.NET XML DataSet looks intriguing to us as a method, but we have never used it.
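
    For what it's worth, the usual answer here is a single Invoices table keyed by an indexed CustomerId rather than per-zip-code tables or per-customer XML files; one million rows over 90 days is small for SQL Server or Azure SQL as long as queries are parameterized and hit that index. A rough sketch of the per-customer lookup, shown in Python with pyodbc purely for illustration (driver name, connection string and column names are all assumptions):

        import pyodbc  # any ODBC driver for SQL Server / Azure SQL works similarly

        # Hypothetical connection string; replace with the real server and credentials.
        conn = pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=example.database.windows.net;DATABASE=Billing;UID=app;PWD=secret")

        def invoices_for_customer(customer_id):
            # One Invoices table with a nonclustered index on CustomerId scales
            # comfortably to millions of rows; no need to split tables by zip code.
            cur = conn.cursor()
            cur.execute(
                "SELECT InvoiceId, InvoiceDate, Total FROM Invoices "
                "WHERE CustomerId = ? ORDER BY InvoiceDate DESC",
                customer_id)
            return cur.fetchall()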

    Read the article

  • Best approach for a clinic database

    - by user18013
    As a practical assignment for the database course I'm taking, I've been instructed to create a database for a local clinic. I've met with the doctors a couple of times and discussed the information that needs to be stored in the database, from personal to medical. Now I'm facing a tough decision because I've been given two choices: either implement the database as a "local website" which only operates inside the clinic via WiFi, or implement the front-end as a regular desktop application connecting to a shared database. Note: I have a 40-day deadline to deliver the first prototype and meet with my client. My questions are:
    1. Which approach should I go with, given that I have more experience with desktop application programming than with web development?
    2. If I go with desktop front-ends, what would be the best way to synchronize the database between all clients? I have no experience here and, having searched a lot for an answer, came up with nothing detailed on this matter.
    3. If I go with the web solution, which choice would be best: PHP & MySQL, ASP.NET & SQL Server, or a different combination? (Given that my knowledge of PHP and ASP.NET is nearly the same.)

    Read the article

  • Need database selection advice

    - by jacknad
    I know this is considered a bad question since there is no correct answer, but I need to decide on a database for embedded linux (DaVinci 368 based) hardware and I've never had to produce a design with a database before. Each record will probably contain less than 1000 images with associated alpha-numeric data and the mass storage will be some kind of flash drive. Only one user needs access to the data at a time. MySQL claims to be "The world's most popular open source database" but SQLite claims to be "the most widely deployed SQL database engine in the world." Perhaps there is another that is also the best in the world? Which is easiest to use for a database newbie? Should I just flip a coin? Does it really matter which one I pick? Do I even need to use a database software package or should I roll my own? I won't need bells and whistles like sorting, but I'll probably need to delete the oldest records to make room for new ones if the storage fills up.
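
    For a single-user embedded board writing to flash storage, an embedded engine such as SQLite avoids running a server process at all; Python's standard library ships a binding, so a capped record store needs very little code. A rough sketch (the path, column names and the 1000-record cap are illustrative):

        import sqlite3

        conn = sqlite3.connect("/var/data/records.db")  # hypothetical path on the flash drive
        conn.execute("""CREATE TABLE IF NOT EXISTS records (
                            id          INTEGER PRIMARY KEY AUTOINCREMENT,
                            captured_at TEXT,
                            image       BLOB,
                            meta        TEXT)""")

        def add_record(image_bytes, meta, keep=1000):
            conn.execute(
                "INSERT INTO records (captured_at, image, meta) "
                "VALUES (datetime('now'), ?, ?)",
                (sqlite3.Binary(image_bytes), meta))
            # Prune the oldest rows so the flash drive never fills up.
            conn.execute(
                "DELETE FROM records WHERE id NOT IN "
                "(SELECT id FROM records ORDER BY id DESC LIMIT ?)", (keep,))
            conn.commit()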

    Read the article

  • Oracle NoSQL Database released

    - by zhangqm
    Oracle recently announced Oracle NoSQL Database. It comes in two editions, a Community Edition and an Enterprise Edition, and the current release is NoSQL Database 11g R2 (11gR2.1.2.123). The following resources are available:
    Oracle NoSQL Database OTN portal (includes download facility)
    Oracle NoSQL Database OTN documentation
    Oracle NoSQL Database license information
    Oracle NoSQL Database is a distributed, highly scalable key-value database designed to store terabytes of data. Each record is a key-value pair: the key is a string path and the value is an opaque array of bytes. Key-value pairs are hashed on the primary key and spread across the storage nodes of the cluster, so capacity and throughput scale out with the number of nodes. Applications access the store through a Java API and the Oracle NoSQL Database driver, which routes each request to the node that holds the key. The API offers Create, Read, Update and Delete (CRUD) operations with configurable durability, and the system can be administered from a web console or the command line. Oracle NoSQL Database is built on Oracle Berkeley DB Java Edition, a proven, high-performance embedded storage engine. Because it scales out on commodity hardware to terabyte-and-beyond data volumes, it is aimed at "big data" and cloud deployments, and it can be combined with ETL and analysis tools such as MapReduce and Hadoop in an acquire-organize-analyze pipeline. Typical use cases for Oracle NoSQL Database include large schema-less data repositories, web click-through capture, sensor/statistics/network capture and messaging (MMS, SMS, routing). Oracle NoSQL Database (Community Edition) also ships as part of the Oracle Big Data Appliance.

    Read the article

  • django+uploadify - not working

    - by Erico
    Hi, I'm trying to use an example posted on GitHub; the link is http://github.com/tstone/django-uploadify. I'm having trouble getting it to work - can you help me? I followed it step by step, but it does not work. When I access the URL /upload/, the only thing it returns is "True".

    Part of settings.py:

        import os
        PROJECT_ROOT_PATH = os.path.dirname(os.path.abspath(__file__))
        MEDIA_ROOT = os.path.join(PROJECT_ROOT_PATH, 'media')
        TEMPLATE_DIRS = (
            os.path.join(PROJECT_ROOT_PATH, 'templates'),
        )

    urls.py:

        from django.conf.urls.defaults import *
        from django.conf import settings
        from teste.uploadify.views import *
        from django.contrib import admin

        admin.autodiscover()

        urlpatterns = patterns('',
            (r'^admin/', include(admin.site.urls)),
            url(r'upload/$', upload, name='uploadify_upload'),
        )

    views.py:

        from django.http import HttpResponse
        import django.dispatch

        upload_received = django.dispatch.Signal(providing_args=['data'])

        def upload(request, *args, **kwargs):
            if request.method == 'POST':
                if request.FILES:
                    upload_received.send(sender='uploadify', data=request.FILES['Filedata'])
            return HttpResponse('True')

    models.py:

        from django.db import models

        def upload_received_handler(sender, data, **kwargs):
            if file:
                new_media = Media.objects.create(
                    file = data,
                    new_upload = True,
                )
                new_media.save()

        upload_received.connect(upload_received_handler, dispatch_uid='uploadify.media.upload_received')

        class Media(models.Model):
            file = models.FileField(upload_to='images/upload/', null=True, blank=True)
            new_upload = models.BooleanField()

    uploadify_tags.py:

        from django import template
        from teste import settings

        register = template.Library()

        @register.inclusion_tag('uploadify/multi_file_upload.html', takes_context=True)
        def multi_file_upload(context, upload_complete_url):
            """
            * filesUploaded - The total number of files uploaded
            * errors - The total number of errors while uploading
            * allBytesLoaded - The total number of bytes uploaded
            * speed - The average speed of all uploaded files
            """
            return {
                'upload_complete_url' : upload_complete_url,
                'uploadify_path' : settings.UPLOADIFY_PATH,  # check this line
                'upload_path' : settings.UPLOADIFY_UPLOAD_PATH,
            }

    Template - uploadify/multi_file_upload.html:

        {% load uploadify_tags %}{% multi_file_upload '/media/images/upload/' %}
        <script type="text/javascript" src="{{ MEDIA_URL }}js/swfobject.js"></script>
        <script type="text/javascript" src="{{ MEDIA_URL }}js/jquery.uploadify.js"></script>
        <div id="uploadify" class="multi-file-upload"><input id="fileInput" name="fileInput" type="file" /></div>
        <script type="text/javascript">// <![CDATA[
        $(document).ready(function() {
            $('#fileInput').uploadify({
                'uploader'      : '/media/swf/uploadify.swf',
                'script'        : '{% url uploadify_upload %}',
                'cancelImg'     : '/media/images/uploadify-remove.png/',
                'auto'          : true,
                'folder'        : '/media/images/upload/',
                'multi'         : true,
                'onAllComplete' : allComplete
            });
        });

        function allComplete(event, data) {
            $('#uploadify').load('{{ upload_complete_url }}', {
                'filesUploaded'  : data.filesUploaded,
                'errorCount'     : data.errors,
                'allBytesLoaded' : data.allBytesLoaded,
                'speed'          : data.speed
            });
            // raise custom event
            $('#uploadify').trigger('allUploadsComplete', data);
        }
        // ]]></script>

    Read the article

  • django: can't adapt error when importing data from postgres database

    - by Oleg Tarasenko
    Hi, I'm getting a strange error while installing a fixture from dumped data. I am using psycopg2 and Django 1.1.1.

        silver:probsbox oleg$ python manage.py loaddata /Users/oleg/probs.json
        Installing json fixture '/Users/oleg/probs' from '/Users/oleg/probs'.
        Problem installing fixture '/Users/oleg/probs.json': Traceback (most recent call last):
          File "/opt/local/lib/python2.5/site-packages/django/core/management/commands/loaddata.py", line 153, in handle
            obj.save()
          File "/opt/local/lib/python2.5/site-packages/django/core/serializers/base.py", line 163, in save
            models.Model.save_base(self.object, raw=True)
          File "/opt/local/lib/python2.5/site-packages/django/db/models/base.py", line 495, in save_base
            result = manager._insert(values, return_id=update_pk)
          File "/opt/local/lib/python2.5/site-packages/django/db/models/manager.py", line 177, in _insert
            return insert_query(self.model, values, **kwargs)
          File "/opt/local/lib/python2.5/site-packages/django/db/models/query.py", line 1087, in insert_query
            return query.execute_sql(return_id)
          File "/opt/local/lib/python2.5/site-packages/django/db/models/sql/subqueries.py", line 320, in execute_sql
            cursor = super(InsertQuery, self).execute_sql(None)
          File "/opt/local/lib/python2.5/site-packages/django/db/models/sql/query.py", line 2369, in execute_sql
            cursor.execute(sql, params)
          File "/opt/local/lib/python2.5/site-packages/django/db/backends/util.py", line 19, in execute
            return self.cursor.execute(sql, params)
        ProgrammingError: can't adapt

    First I checked similar issues on the internet. This one seemed very related, as my data has many non-ASCII symbols: http://code.djangoproject.com/ticket/5996. However, I've checked my Django installation and it's fine there. Could you advise what is wrong?
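
    For context, psycopg2 raises "can't adapt" when a query parameter has a Python type it does not know how to convert to SQL, so the first thing to check is which fixture field is being deserialized into an unusual type. If a custom type really is involved, registering an adapter is one way around it; the uuid example below is purely illustrative of the mechanism:

        import uuid
        from psycopg2.extensions import register_adapter, AsIs

        def adapt_uuid(value):
            # Tell psycopg2 how to render this Python type as an SQL literal.
            return AsIs("'%s'" % value)

        # Hypothetical: only needed if fixtures hand psycopg2 a type it can't adapt.
        register_adapter(uuid.UUID, adapt_uuid)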

    Read the article

  • django+mod_wsgi on virtualenv not working

    - by jwesonga
    I've just finished setting up a Django app on virtualenv. Deployment went smoothly using a fabric script, but now the .wsgi is not working; I've tried every variation on the internet but no luck. My .wsgi file is:

        import os
        import sys

        import django.core.handlers.wsgi

        # put the Django project on sys.path
        root_path = os.path.abspath(os.path.dirname(__file__) + '../')
        sys.path.insert(0, os.path.join(root_path, 'kcdf'))
        sys.path.insert(0, root_path)

        os.environ['DJANGO_SETTINGS_MODULE'] = 'kcdf.settings'

        application = django.core.handlers.wsgi.WSGIHandler()

    I keep getting the same error:

        [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159] mod_wsgi (pid=16938): Exception occurred processing WSGI script '/home/kcdfweb/webapps/kcdf.web/releases/current/kcdf/apache/kcdf.wsgi'.
        [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159] Traceback (most recent call last):
        [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159]   File "/usr/local/lib/python2.6/dist-packages/django/core/handlers/wsgi.py", line 230, in __call__
        [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159]     self.load_middleware()
        [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159]   File "/usr/local/lib/python2.6/dist-packages/django/core/handlers/base.py", line 33, in load_middleware
        [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159]     for middleware_path in settings.MIDDLEWARE_CLASSES:
        [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159]   File "/usr/local/lib/python2.6/dist-packages/django/utils/functional.py", line 269, in __getattr__
        [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159]     self._setup()
        [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159]   File "/usr/local/lib/python2.6/dist-packages/django/conf/__init__.py", line 40, in _setup
        [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159]     self._wrapped = Settings(settings_module)
        [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159]   File "/usr/local/lib/python2.6/dist-packages/django/conf/__init__.py", line 75, in __init__
        [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159]     raise ImportError, "Could not import settings '%s' (Is it on sys.path? Does it have syntax errors?): %s" % (self.SETTINGS_MODULE, e)
        [Sun Apr 18 12:44:30 2010] [error] [client 41.215.123.159] ImportError: Could not import settings 'kcdf.settings' (Is it on sys.path? Does it have syntax errors?): No module named kcdf.settings

    My virtual environment is at /home/user/webapps/kcdfweb, my app is at /home/user/webapps/kcdf.web/releases/current/project_name, and my .wsgi file is at /home/user/webapps/kcdf.web/releases/current/project_name/apache/project_name.wsgi.
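
    Two things stand out: the root_path computation concatenates '../' without a path separator (os.path.dirname(__file__) + '../' normalizes to a non-existent 'apache..' directory, so the kcdf package never lands on sys.path), and the traceback shows Django loading from the system dist-packages, meaning the virtualenv is not on the path either. A hedged rewrite of the .wsgi along those lines - all paths below are guesses based on the layout described and would need adjusting:

        import os
        import site
        import sys

        # Hypothetical paths; adjust to the real virtualenv and project layout.
        VENV_SITE_PACKAGES = '/home/kcdfweb/webapps/kcdfweb/lib/python2.6/site-packages'
        PROJECT_ROOT = '/home/kcdfweb/webapps/kcdf.web/releases/current'

        site.addsitedir(VENV_SITE_PACKAGES)          # pull in the virtualenv's packages
        sys.path.insert(0, PROJECT_ROOT)             # directory that contains the kcdf package
        sys.path.insert(0, os.path.join(PROJECT_ROOT, 'kcdf'))

        os.environ['DJANGO_SETTINGS_MODULE'] = 'kcdf.settings'

        import django.core.handlers.wsgi
        application = django.core.handlers.wsgi.WSGIHandler()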

    Read the article

  • Trying to get django app to work with mod_wsgi on CentOS 5

    - by David
    I'm running CentOS 5 and am trying to get a Django application working with mod_wsgi. I'm using .wsgi settings I got working on Ubuntu. Here is the error:

        [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251] SystemError: dynamic module not initialized properly
        [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251] mod_wsgi (pid=23630): Target WSGI script '/data/hosting/cubedev/apache/django.wsgi' cannot be loaded as Python module.
        [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251] mod_wsgi (pid=23630): Exception occurred processing WSGI script '/data/hosting/cubedev/apache/django.wsgi'.
        [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251] Traceback (most recent call last):
        [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251]   File "/data/hosting/cubedev/apache/django.wsgi", line 8, in <module>
        [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251]     import django.core.handlers.wsgi
        [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251]   File "/opt/python2.6/lib/python2.6/site-packages/django/core/handlers/wsgi.py", line 1, in <module>
        [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251]     from threading import Lock
        [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251]   File "/opt/python2.6/lib/python2.6/threading.py", line 13, in <module>
        [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251]     from functools import wraps
        [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251]   File "/opt/python2.6/lib/python2.6/functools.py", line 10, in <module>
        [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251]     from _functools import partial, reduce
        [Thu Mar 04 10:52:15 2010] [error] [client 10.1.0.251] SystemError: dynamic module not initialized properly

    And here is my .wsgi file:

        import os
        import sys

        os.environ['PYTHON_EGG_CACHE'] = '/tmp/django/'
        os.environ['DJANGO_SETTINGS_MODULE'] = 'cube.settings'
        sys.path.append('/data/hosting/cubedev')

        import django.core.handlers.wsgi
        application = django.core.handlers.wsgi.WSGIHandler()

    Read the article

  • Error in django using Apache & mod_wsgi

    - by Ignacio
    Hey, I've been making some changes to my Django development environment, as some of you suggested. So far I've managed to configure and run it successfully with Postgres. Now I'm trying to run the app using Apache2 and mod_wsgi, but I ran into this little problem after following the guidelines from the Django docs. When I access localhost/myapp/tasks this error is raised:

        Request Method: GET
        Request URL: http://localhost/myapp/tasks/
        Exception Type: TemplateSyntaxError
        Exception Value: Caught an exception while rendering: argument 1 must be a string or unicode object

        Original Traceback (most recent call last):
          File "/usr/local/lib/python2.6/dist-packages/django/template/debug.py", line 71, in render_node
            result = node.render(context)
          File "/usr/local/lib/python2.6/dist-packages/django/template/defaulttags.py", line 126, in render
            len_values = len(values)
          File "/usr/local/lib/python2.6/dist-packages/django/db/models/query.py", line 81, in __len__
            self._result_cache = list(self.iterator())
          File "/usr/local/lib/python2.6/dist-packages/django/db/models/query.py", line 238, in iterator
            for row in self.query.results_iter():
          File "/usr/local/lib/python2.6/dist-packages/django/db/models/sql/query.py", line 287, in results_iter
            for rows in self.execute_sql(MULTI):
          File "/usr/local/lib/python2.6/dist-packages/django/db/models/sql/query.py", line 2369, in execute_sql
            cursor.execute(sql, params)
          File "/usr/local/lib/python2.6/dist-packages/django/db/backends/util.py", line 19, in execute
            return self.cursor.execute(sql, params)
        TypeError: argument 1 must be a string or unicode object
        ...

    It then highlights a {% for t in tasks %} template tag, as if the source of the problem were there, but it worked fine on the built-in server. The view associated with that page is really simple: it just fetches all Task objects, and the template just displays them in a table. Also, some pages get rendered OK. I don't want to fill this question with code, so if you need some more info I'd be glad to provide it. Thanks.

    Read the article

  • SQL SERVER – World Shapefile Download and Upload to Database – Spatial Database

    - by pinaldave
    During my recent training I was asked by a student if I knew a place where he could download spatial files for all the countries around the world, and whether there is a way to upload shapefiles to a database. Here is a quick tutorial. VDS Technologies has free spatial files for every location; you can download them from here. If you cannot find the spatial file you are looking for, please leave a comment here and I will send you the necessary details. Unzip the file to a folder and it will have the following content. Then download the Shape2SQL tool from SharpGIS. This is one of the best tools available to convert shapefiles to SQL tables. Afterwards, run the .exe file. When the file is run for the first time, it will ask for the database properties; provide your database details. Select the appropriate shapefiles and the tool will fill in the essential details automatically. If you do not want to create the index on a column, uncheck the box beside it. The screenshot below simply explains the procedure. You also have to be careful regarding your data, whether it is GEOMETRY or GEOGRAPHY; in this example it is GEOMETRY data. Click "Upload to Database" and it will show you the upload process. Once the shapefile is uploaded, close the application and open SQL Server Management Studio (SSMS). Run the following code in the SSMS Query Editor:

        USE Spatial
        GO
        SELECT * FROM dbo.world
        GO

    This will show the complete map of the world after you click on Spatial Results in the Spatial tab. In the Spatial Results set, the Zoom feature is available. From the "Select label column" drop-down, choose the country name in order to show the country name overlaying the country borders. Let me know if this tutorial is helpful enough. I am planning to write a few more posts about this later. Note: please note that the images displayed here do not reflect the original political boundaries. The data is pretty old and can probably draw incorrect maps as well; I have personally spotted several parts of the map where some countries are located a little inaccurately.

    Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Filed under: Pinal Dave, SQL, SQL Add-On, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Spatial, SQL Tips and Tricks, SQL Utility, T SQL, Technology

    Read the article

  • The Road to Professional Database Development: Database Normalization

    Not only is the process of normalization valuable for increasing data quality and simplifying the process of modifying data, but it actually makes the database perform much faster. To prove the point, Peter Larsson takes a large unnormalised database and subjects it to successive stages of normalisation.

    Read the article

  • IntegrityError: foreign key violation upon delete

    - by Lukasz Korzybski
    I have Order and Shipment models; Shipment has a foreign key to Order:

        class Order(...):
            ...

        class Shipment(...):
            order = m.ForeignKey('Order')
            ...

    Now in one of my views I want to delete an order object along with all related objects, so I invoke order.delete(). I have Django 1.0.4 and PostgreSQL 8.4, and I use the transaction middleware, so the whole request is enclosed in a single transaction. The problem is that upon order.delete() I get:

        ...
          File "/usr/local/lib/python2.6/dist-packages/django/db/backends/__init__.py", line 28, in _commit
            return self.connection.commit()
        IntegrityError: update or delete on table "main_order" violates foreign key constraint "main_shipment_order_id_fkey" on table "main_shipment"
        DETAIL:  Key (id)=(45) is still referenced from table "main_shipment".

    I checked in connection.queries that the proper queries are executed in the proper order. First the shipment is deleted, and after that Django executes the delete on the order row:

        {'time': '0.000', 'sql': 'DELETE FROM "main_shipment" WHERE "id" IN (17)'},
        {'time': '0.000', 'sql': 'DELETE FROM "main_order" WHERE "id" IN (45)'}

    The foreign key has ON DELETE NO ACTION (the default) and is initially deferred. I don't know why I get the foreign key constraint violation. I also tried registering a pre_delete signal and manually deleting the shipment objects before the delete on the order is called, but it resulted in the same error. I could change the ON DELETE behaviour for this key in Postgres, but that would just be a hack; I wonder if anyone has a better idea of what's going on here. One small detail: my Order model inherits from a Cart model, so it actually doesn't have an id field but cart_ptr_id, and after the DELETE on order there is also a DELETE on cart, but that seems unrelated to the shipment-order problem, so I simplified it in the example.

    Read the article
