Search Results

Search found 31483 results on 1260 pages for 'database migration'.

Page 612/1260 | < Previous Page | 608 609 610 611 612 613 614 615 616 617 618 619  | Next Page >

  • CodeIgniter based e-shop, shipping and gift address design problem

    - by alexander
    While building an e-commerce platform I have run into design problems. I'm working with CodeIgniter's built-in cart class, which stores all the cart information in the session. Let's say the cart has already been filled with products and the user clicks checkout. When should I store the order in the database? Right after that click, or only after several steps of gathering information and storing it in the session? How should I deal with additional features like different shipping methods? Should I add them to the basket first and keep the additional data (gift address) in the session? I don't want to store the gift address in the database yet, because a relation between the gift address and the order is needed and at that point I don't know the order's ID. I'm puzzled :) I also think it's crucial to keep the cart aware of shipping methods and additional purchased services (selecting a gift address adds an extra fee), because the cart contents are essentially the receipt. In brief, what is the best practice for processing a checkout?
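
    One common approach (sketched below inside a hypothetical Orders controller, using CodeIgniter's Active Record and made-up orders/order_addresses/order_items tables) is to keep everything in the session until the final confirmation step, insert the order row first, and then use the returned insert ID to save the dependent gift-address and line-item rows:

        // Hypothetical checkout confirmation step -- only now touch the database.
        public function confirm()
        {
            $order = array(
                'user_id'  => $this->session->userdata('user_id'),
                'shipping' => $this->session->userdata('shipping_method'),
                'total'    => $this->cart->total(),
            );
            $this->db->insert('orders', $order);          // create the order row
            $order_id = $this->db->insert_id();           // now the order ID is known

            $gift = $this->session->userdata('gift_address');
            if ($gift) {
                $gift['order_id'] = $order_id;            // relate gift address to order
                $this->db->insert('order_addresses', $gift);
            }

            foreach ($this->cart->contents() as $item) {  // persist the receipt lines
                $this->db->insert('order_items', array(
                    'order_id'   => $order_id,
                    'product_id' => $item['id'],
                    'qty'        => $item['qty'],
                    'price'      => $item['price'],
                ));
            }
        }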

    Read the article

  • WPF Application Slow/Unresponsive when demonstrating using remote sharing software

    - by Kev
    After spending 14 hours on this I think it's time to share my woes and see if anyone has experienced this issue before. I'll describe the issue and the tests I have done to rule certain things out.

    I have a WPF application which loads data from a SQL database. I am using DevExpress components for the datagrids, ribbons etc., FluentNHibernate to provide a session for database operations, and log4net to log events to a text file. Using the application on my laptop with SQL Express 2008 it works fine: the application starts up, retrieves 1000 records, and I can tab through the controls on the ribbon.

    Then I decided to demo the application to a third party and used online remote login/screen-sharing software to share my desktop, so I could run the application on my laptop while the other person watched. Now the application takes approximately 45 seconds to load (30 seconds with a blank database), whereas when I'm not sharing my screen it loads in about 7-10 seconds. On top of that, the controls in the application were very sticky, slow and unresponsive during the demo. During the sharing session, however, I was able to use other applications without any problems; everything else worked fine. I cannot understand how my application works fine under normal conditions, even while browsing the net at the same time, but totally fails to perform correctly when I am sharing a session with another user. CPU usage also shot up to 100% at times while the application was trying to start.

    Here is a list of the 3rd-party DLLs I reference in my project: DevExpress DLLs, FluidKit, PixelLab.WPF, PixelLab.Common, Galasoft WPF Kit, FluentNHibernate, NHibernate, NHibernate.ByteCode.Castle, Skype4ComLib, TXTEXTControl, log4net, LinqKit. All of these DLLs are in the output folder alongside the application DLLs created from the class assemblies in the project, so when installed on a machine via an installer the DLLs will be in the same folder as the application itself. Many thanks.

    Read the article

  • PHP / MySQL pagination

    - by arrgggg
    Hi, I have a table with 58 records in a MySQL database. I was able to connect to the database, retrieve all the records and build 5 pages with links to view each page using a PHP script. The web page looks like this:

        name   number
        john   1232343456
        tony   9878768544
        jack   3454562345
        joe    1232343456
        jane   2343454567
        andy   2344560987
        marcy  9873459876
        sean   8374623534
        mark   9898787675
        nancy  8374650493

        1 2 3 4 5

    That's the first page of the 58 records, and those 5 numbers at the bottom are links to each page, each displaying the next 10 records. I have all of that working. What I want to do now is display the links this way:

        1-10 11-20 21-30 31-40 41-50 51-58

    Note: since I have 58 records, the last link only goes up to 58 instead of 60. Since I use a loop to create these links, they should adjust according to the number of records in my table. How can I do this? Thanks.
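
    A minimal sketch of the loop that builds such range labels (assuming $total holds the record count from a COUNT(*) query and $perPage is 10; page.php and its p parameter are made-up names):

        <?php
        $total   = 58;              // e.g. from SELECT COUNT(*) FROM contacts
        $perPage = 10;
        $pages   = (int) ceil($total / $perPage);

        for ($i = 0; $i < $pages; $i++) {
            $start = $i * $perPage + 1;                 // first record on this page
            $end   = min(($i + 1) * $perPage, $total);  // last record, capped at $total
            printf('<a href="page.php?p=%d">%d-%d</a> ', $i + 1, $start, $end);
        }
        // Output: 1-10 11-20 21-30 31-40 41-50 51-58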

    Read the article

  • Url shortening algorithm

    - by Bozho
    Now, this is not strictly about URL shortening, but my purpose is such anyway, so let's view it like that. Of course the steps to URL shortening are:

    1. Take the full URL.
    2. Generate a unique short string to be the key for the URL.
    3. Store the URL and the key in a database (a key-value store would be a perfect match here).

    Now, about the 2nd point. Here's what I've come up with:

        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        DataOutputStream dos = new DataOutputStream(baos);
        UUID uuid = UUID.randomUUID();
        dos.writeLong(uuid.getMostSignificantBits());
        String encoded = new String(Base64.encodeBase64(baos.toByteArray()), "ISO-8859-1");
        String shortUrl = StringUtils.left(encoded, 6); // returns the leftmost 6 characters
        // check if it exists in the database, repeat until it does not

    I wonder if this is good enough. Is it?

    Read the article

  • How would I UPDATE these table entries with SQL and PHP?

    - by CT
    I am working on an asset database problem. I enter assets into a database. Every object is an asset and has variables within the asset table. An object is also a type of asset; in this example the type is server. Here is the query to retrieve all necessary data:

        SELECT asset.id, asset.company, asset.location, asset.purchaseDate,
               asset.purchaseOrder, asset.value, asset.type, asset.notes,
               server.manufacturer, server.model, server.serialNumber, server.esc,
               server.warranty, server.user, server.prevUser, server.cpu,
               server.memory, server.hardDrive
        FROM asset
        LEFT JOIN server ON server.id = asset.id
        WHERE asset.id = '$id'

    I then assign all results into single PHP variables. How would I write a query/script to update an asset?
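
    A sketch of one way the update could look, assuming the same two tables, an existing mysql_* connection, and the escaping style used elsewhere on this page (only a few of the columns are shown; the column lists would be extended to match the SELECT above):

        <?php
        // Hypothetical update script: one UPDATE per table, joined by the shared id.
        $id = mysql_real_escape_string($_POST['id']);

        $assetSql = sprintf(
            "UPDATE asset SET company = '%s', location = '%s', notes = '%s' WHERE id = '%s'",
            mysql_real_escape_string($_POST['company']),
            mysql_real_escape_string($_POST['location']),
            mysql_real_escape_string($_POST['notes']),
            $id
        );
        mysql_query($assetSql) or die(mysql_error());

        $serverSql = sprintf(
            "UPDATE server SET manufacturer = '%s', model = '%s', serialNumber = '%s' WHERE id = '%s'",
            mysql_real_escape_string($_POST['manufacturer']),
            mysql_real_escape_string($_POST['model']),
            mysql_real_escape_string($_POST['serialNumber']),
            $id
        );
        mysql_query($serverSql) or die(mysql_error());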

    Read the article

  • Update BIOS on Sun Fire X4150 server

    - by Massimo
    I have some Sun Fire X4150 servers with a very old BIOS release (1ADQW015), which seems to have some compatibility problems with VMware ESX Server 3.5 and Windows 2008 R2 virtual machines, so I want to update the BIOS on them.

    The problem: according to this page, if your servers run ELOM (mine do), you first need to update to the latest ELOM release, then to the interim transition release, and only then can you update to the latest one. OK, I'm willing to do that... but it looks like Sun (now Oracle) will happily let you download the latest firmware DVD (3.3.0), yet will not let you download the transition release (2.0) if you don't have a support contract. Well, I actually don't care at all about the servers' management controllers (we don't even use them), so upgrading from ELOM to ILOM is totally irrelevant to me; but I do need to update the servers' BIOS.

    So my question is: can I update the servers' BIOS to the latest version without doing the full ELOM-to-ILOM migration, or will this not work (or even make the servers unusable)? Do BIOS versions and SP versions need to be matched, or can one be updated without bothering with the other?

    Bonus question: if this whole ELOM-to-ILOM thing actually is needed in order to update the BIOS, can that 2.0 CD-ROM be obtained without having a support contract with Sun/Oracle (which we are definitely not going to sign, given that this is quite old hardware)?

    Update: I tried upgrading only the BIOS on one of the servers, and it didn't boot anymore. So it really looks like a full firmware upgrade is needed, and the management controller and BIOS versions should be kept in sync. So... where can I find that *&!£%$% 2.0 CD-ROM? Or at least the transition firmware that can be found on it?

    Read the article

  • CodeIgniter's XSS Protection is removing <script> tags from user inputs... but I don't want it to!

    - by Jack W-H
    Hey folks, CodeIgniter is brilliant, but I'm using it to develop a site where users need to be able to share their code for websites. Unfortunately, CodeIgniter has been doing the "right" thing by removing <script> tags from my users' input into the database, so when it's returned the data looks like this: [removed] User's data [removed]. However, I need my site to DISPLAY script tags, but obviously not PARSE them. How can I get CodeIgniter or PHP to return <script> tags, still sanitise them for the database, and return them without them executing? Thanks! Jack

    EDIT: By the way, it's not an option to use something like Markdown; everything has to output to copy-pastable code that would work with no modification somewhere else.
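
    One commonly suggested approach (a sketch, not the only way) is to disable CodeIgniter's global XSS filter for that input ($config['global_xss_filtering'] in config.php) and instead HTML-encode the stored code on output, so the browser displays the tags without ever executing them:

        <?php
        // $snippet comes from the database exactly as the user typed it.
        $snippet = '<script>alert("hello");</script>';

        // Encoding on output turns < and > into &lt; and &gt;,
        // so the tag is shown to the reader but never parsed by the browser.
        echo '<pre><code>' . htmlspecialchars($snippet, ENT_QUOTES, 'UTF-8') . '</code></pre>';

    The text stays copy-pastable: what the user selects and copies is the original, working code.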

    Read the article

  • Problem updating blob with Hibernate?

    - by johnsmith51
    Hi, I am having a problem updating a blob with Hibernate. My model has these getters/setters for Hibernate, i.e. internally I deal with byte[], so the getter/setter converts the byte[] to a Blob. I can create an initial object without problem, but if I try to change the content of the blob, the database column is not updated. I do not get any error message, everything looks fine, except that the database is not updated.

        /** do not use, for hibernate only */
        public Blob getLogoBinaryBlob() {
            if (logoBinary == null) {
                return null;
            }
            return Hibernate.createBlob(logoBinary);
        }

        /** do not use, for hibernate only */
        public void setLogoBinaryBlob(Blob logoBinaryBlob) {
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            try {
                logoBinary = toByteArrayImpl(logoBinaryBlob, baos);
            } catch (Exception e) {
            }
        }

    Read the article

  • Is it really wrong to version documents using CouchDB's default behaviour?

    - by Tomas Sedovic
    This is one of those "I know I shouldn't do this, but it's oh so convenient" questions. Sorry about that. I plan to use CouchDB for storing a bunch of documents and keeping their entire revision history. CouchDB does the versioning automatically, but relying on it is strongly discouraged: "You cannot rely on document revisions for any other purpose than concurrency control." From what I've found on the CouchDB wiki, old revisions can get deleted either during compaction or during replication. As far as I can tell, compaction must always be triggered manually and replication occurs only when there's more than one database server. The question is: if I won't run compaction and will use only a single database instance for my documents, can I just use CouchDB's document versioning and expect it to work? What other problems might I run into? E.g., does not running compaction hurt performance or consume significantly more disk space (than if I handled the versioning manually)?

    Read the article

  • JSON encode MySQL results then read using jQuery

    - by silentw
    I have a database table with some rows that I want to fetch using PHP and then encode as JSON. Currently, my table structure is: idcomponente | quantidade. After fetching the values in PHP, I want to know how I can encode them as JSON (multiple rows, using the same field names) so I can read them using jQuery.post():

        $.post('test.php', {id: id}, function (data) {
            // READ data HERE
        });

    Thanks in advance.

    Edit: So far, I made this:

        $.post('edit/producomponentes.php', {id: id}, function (data) {
            console.log(data);
        });

    which logs this:

        [Object { componente="1", quantidade="2"}, Object { componente="3", quantidade="3"}]

    Now how can I go through each row and fetch its properties? (data.componente, data.quantidade)
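
    On the PHP side, a minimal sketch (column names taken from the question; the table name componentes and the WHERE clause are assumptions) would collect the rows into an array and emit it with json_encode():

        <?php
        // test.php -- sketch of returning all matching rows as a JSON array.
        $id  = mysql_real_escape_string($_POST['id']);
        $res = mysql_query("SELECT idcomponente, quantidade FROM componentes WHERE id = '$id'");

        $rows = array();
        while ($row = mysql_fetch_assoc($res)) {
            $rows[] = $row;                       // each row keeps its column names as keys
        }

        header('Content-Type: application/json');
        echo json_encode($rows);                  // e.g. [{"idcomponente":"1","quantidade":"2"}, ...]

    On the client, passing 'json' as the fourth argument to $.post() makes data arrive already parsed, so the rows can be walked with $.each(data, function (i, row) { ... }) and accessed as row.quantidade etc.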

    Read the article

  • SQLite multi process access

    - by Nicolas
    Hello, we are using SQLite in a multi-process and multi-threaded application. The SQLite database files are encrypted using the embedded SQLite encryption. The FAQ states that SQLite should be able to manage multi-process access using a locking mechanism. We are experiencing a strange problem: when many threads are accessing the same database file, constraint violations sometimes occur; more specifically, a field with a unique constraint gets duplicate values after calling an "insert or replace" statement. It happens quite often now that we are using the encryption; before we started using SQLite encryption we did not notice such behavior. Are there any specific known issues with this?

    Read the article

  • Adding users to Sharepoint when they are not in the same domain

    - by jim-work
    Bear with me as I explain this; I'm working my way through SharePoint access as I go, but I'll clarify my question as I go along.

    The problem: We have about 10,000 users who need access to our SharePoint 2005 based reporting. Because our organization is migrating from one domain to another, we need to add each user twice, once for each domain. For the current domain this is no problem: we've got a PowerShell script that I tweaked to add all the users in a given CSV file, and it takes about 5 minutes to run. The big problem we're having is with users who are NOT in our currently active domain. Because the SharePoint server cannot authenticate the new users, we can't add them directly. What we're doing is creating a temp user, then using STSADM.EXE to migrate that temp user to the proper domain/user_name for each of our 10,000 users. The creation and migration take about 5 seconds per user, or well over 12 hours to run.

    The question: Has anyone encountered this before? Is there a way to add users without requiring AD authentication? Why is STSADM.EXE running so slowly? Thanks a lot for any advice or direction anyone can give me.

    Read the article

  • What are the best tools for SQL Server version control

    - by Mendy
    After reading this post, and the suggestion to use Team Edition for Database Professionals, I want to know whether there is an equivalent for SQL Server 2008 / Visual Studio 2010 Ultimate. I'm looking for a tool that does all the things Jeff mentions in his article:

    - Create test data
    - Schema comparison
    - Data comparison
    - Database unit testing
    - Refactoring
    - Integrated T-SQL editor, a first-class language construct in the IDE, just like C# and VB.NET

    Read the article

  • Is it possible to use the Windows Speech Recognition Engine in a word pronunciation game?

    - by XBasic3000
    I want to create an application that uses the Windows speech recognition engine (SAPI). It's like a pronunciation game: it gives you a score when you pronounce a word correctly. But when I started experimenting with SAPI, I found it has poor recognition unless you load a grammar into it (XML), in which case it gives the best recognition results. The problem now is that the closest pronunciation to the input text is always recognized. For example: "database" pronounced "dedebase" is marked correct, so even if you mispronounce the word it gives you a correct answer. Without the XML grammar, when you say "database" it gives you "in the base / the base / data base / etc...". Please post your answers, suggestions or clarifications; votes for the best answer. Is it possible or not? By the way, I use the Delphi compiler for the project.

    Read the article

  • How to build my own application settings

    - by adisembiring
    I want to build application settings handling for my application. The settings can be stored in a properties file or in a database table, as key-value pairs, e.g.:

        ftp.host = blade
        ftp.username = dummy
        ftp.pass = pass
        content.row_pagination = 20
        content.title = How to train your dragon.

    I have designed it as follows.

    Application settings readers:

        interface IApplicationSettingReader {
            AppSettings read();
        }

        class DatabaseApplicationSettingReader implements IApplicationSettingReader {
            AppSettingDao appSettingDao;

            public AppSettings read() {
                List<AppSettingEntity> listEntity = appSettingDao.findAll();
                Map<String, String> map = new HashMap<String, String>();
                for (AppSettingEntity entity : listEntity) {
                    map.put(entity.getConfigName(), entity.getConfigValue());
                }
                return new AppSettings(map);
            }
        }

        class PropertiesApplicationSettingReader implements IApplicationSettingReader {
            public AppSettings read() {
                // read from some properties file
                return new AppSettings(map);
            }
        }

    Application settings class:

        class AppSettings {
            private static AppSettings instance;
            private Map<String, String> map;

            public AppSettings(Map<String, String> map) {
                this.map = map;
            }

            public static AppSettings getInstance() {
                if (instance == null) {
                    throw new RuntimeException("Object not configured yet");
                }
                return instance;
            }

            public static void configure(IApplicationSettingReader reader) {
                instance = reader.read();
            }

            public String getFtpSetting(String param) {
                return map.get("ftp." + param);
            }

            public String getContentSetting(String param) {
                return map.get("content." + param);
            }
        }

    Test class:

        class AppSettingsTest {
            IApplicationSettingReader reader;

            @Before
            public void setUp() throws Exception {
                reader = new DatabaseApplicationSettingReader();
            }

            @Test
            public void getContentSetting_should_get_content_title() {
                AppSettings.configure(reader);
                AppSettings settings = AppSettings.getInstance();
                String title = settings.getContentSetting("title");
                assertNotNull(title);
                System.out.println(title);
            }
        }

    My questions are: Can you give your opinion about my code — is there something wrong with it? I configure my application settings once, while the application starts, with the appropriate reader (DbReader or PropertiesReader), and I make the settings a singleton. The problem is that when someone edits the database or the file directly, I can't see the changed values. Now I want to implement something like an ApplicationSettingChangeListener, so that if the data changes I can refresh my application settings. Do you have any suggestions how this can be implemented?

    Read the article

  • Customized User Registration Form

    - by Nitz
    Hey guys, I have made a user-register.tpl.php file and added many text fields to it. Now I need to store the users' information in the database: because I created a customized registration page, my extra text field values should be stored in the database too. For example:

        Username: <input type="text" name="myuser" id="myuser" />

    Now I want to store the username that is entered in this myuser text field.

    NitishPanchjanya Corporation
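
    Since user-register.tpl.php only themes the form, the extra field usually also has to be declared and saved from a small custom module. A rough sketch, assuming Drupal 6-style hooks; the module name example_extra and the table example_profile are made-up names:

        <?php
        // example_extra.module -- hypothetical module that adds and saves a custom field.

        // Declare the extra field on the registration form so Drupal knows about it.
        function example_extra_form_alter(&$form, &$form_state, $form_id) {
          if ($form_id == 'user_register') {
            $form['myuser'] = array(
              '#type' => 'textfield',
              '#title' => t('Username'),
              '#required' => TRUE,
            );
          }
        }

        // Save the submitted value once the new account exists.
        function example_extra_user($op, &$edit, &$account, $category = NULL) {
          if ($op == 'insert' && isset($edit['myuser'])) {
            db_query("INSERT INTO {example_profile} (uid, myuser) VALUES (%d, '%s')",
                     $account->uid, $edit['myuser']);
          }
        }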

    Read the article

  • Regenerate the ID column in a MySQL table with a MySQL script or PHP

    - by DomingoSL
    Hello, I have a database filled with information about the users of my web page. As with many MySQL tables, the table has an auto-increment ID column. The issue is that when somebody deletes his account, a gap remains in the ID sequence, and I don't want that because I have a script that fails if it finds a gap in the IDs. Example:

        ID  Name   PASS
        1   Jhon   1234
        2   Max    2233
        3   Jorge  2232

    If Max leaves and a new user signs up, this is what will happen:

        ID  Name   PASS
        1   Jhon   1234
        4   NewU   1133
        3   Jorge  2232

    So what is the best way to delete somebody from the database in order to avoid this issue? Or, if there is no such way, is it possible to write a PHP or MySQL script that wipes the ID column and regenerates it in order? Thanks a lot! Sorry for my English.
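
    If the script really must have gap-free IDs, one possible (and generally discouraged, since it breaks any foreign keys pointing at the old IDs) approach is to renumber the column and reset the auto-increment counter. A rough sketch in the mysql_* style used elsewhere on this page, with a made-up table name users:

        <?php
        mysql_connect('localhost', $username, $password);
        mysql_select_db($database);

        // Renumber the id column sequentially, keeping the current id order.
        mysql_query("SET @n := 0") or die(mysql_error());
        mysql_query("UPDATE users SET id = (@n := @n + 1) ORDER BY id") or die(mysql_error());

        // Reset the counter so the next INSERT continues from the new maximum.
        mysql_query("ALTER TABLE users AUTO_INCREMENT = 1") or die(mysql_error());

    The more common advice, though, is to fix the script so it does not depend on the IDs being consecutive.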

    Read the article

  • Echo autoincrement id doubt

    - by Marcelo
    Hi, can I print the ID even if it's auto-increment? The way I'm doing it, I'm using an empty variable for the ID:

        $id = "";
        mysql_connect('localhost', $username, $password);
        @mysql_select_db($database) or die("Não conectou com a base $database");
        mysql_query("INSERT INTO table1 (id, ...) VALUES ('" . $id . "', ....)") or die(mysql_error());
        mysql_close();
        echo "Your id is: ";
        echo "" . $id;

    I'm trying to print the ID, but it comes out blank. I checked the table and there's an ID number there. How can I print it then? Thanks for the attention.
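
    For reference, a common way to get the value is mysql_insert_id(), which returns the ID generated by the last INSERT on the current connection. A small sketch, assuming the id column is AUTO_INCREMENT and is simply left out of the INSERT:

        <?php
        mysql_connect('localhost', $username, $password);
        mysql_select_db($database) or die("Could not select database $database");

        // Let MySQL assign the id by leaving it out of the column list.
        mysql_query("INSERT INTO table1 (name) VALUES ('some value')") or die(mysql_error());

        $id = mysql_insert_id();   // id generated by the INSERT above
        echo "Your id is: " . $id;

        mysql_close();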

    Read the article

  • Get list of named queries in NHibernate

    - by Dan
    I have a dozen or so named queries in my NHibernate project and I want to execute them against a test database in unit tests to make sure the syntax still matches the changing domain/database model. Currently I have a unit test for each named query where I get and execute the query, for example:

        IQuery query = session.GetNamedQuery("GetPersonSummaries");
        var personSummaryArray = query.List();
        Assert.That(personSummaryArray, Is.Not.Null);

    This works fine, but I would like to have one unit test that loops through all of the named queries and executes them. Is there a way to discover all of the available named queries? Thanks, Dan

    Read the article

  • DAS vs SAN storage for serving 2 to 4 nodes

    - by Luke404
    We currently have 4 Linux nodes with local storage, arranged in two active/passive pairs with storage mirrored using DRBD, running virtual machines (actually using the Xen hypervisor) for typical hosting workloads (mail, web, a couple of VPS, etc.). We're approaching the (presumed) maximum IOPS of those servers, and we're planning to migrate to an external storage solution with two active nodes, with capacity for up to four active nodes.

    Since we're an all-Dell shop I've done some research and found the MD3200 / MD3200i products should be the ones we're looking for. We are pretty sure we won't be attaching more than 4 hosts to a single storage unit, and I'm wondering if there is any clear advantage of one over the other. In theory I should be able to attach 4 SAS hosts to a single MD3200 (single links on a single-controller MD3200, or dual redundant SAS links from each host to a dual-controller MD3200), or 4 iSCSI hosts to a single MD3200i (directly on its 4 GigE ports without any switch, again with dual links for the dual-controller option). Both setups should let us implement live VM migration since all hosts can access all the LUNs at the same time, along with some shared filesystem like GFS2 or OCFS2. Also, both setups should allow full redundancy of the whole system (assuming dual controllers in the storage).

    One difference I can see is that the DAS solution is actually limited to 4 hosts while the iSCSI one should be able to grow to more hosts (by adding two GigE switches to the mix). One point for the iSCSI solution is that it would allow us to start out with our current nodes and upgrade them at a later time (we can't add other SAS controllers, but they already have 4 GigE ports each). With the right (iSCSI|SAS) controllers I should be able to connect diskless nodes and boot them off the external storage, which I think is a good thing (getting rid of any local storage). On the other hand, I would have thought the SAS one to be cheaper, but it seems like an MD3200 actually costs a little less than an MD3200i (?).

    (Please note: I've used Dell gear in my examples since that's what we're looking for, but I assume the same goes for other vendors.) I would like to know if my assumptions above are correct, and if I'm missing any important difference between the two setups.

    Read the article

  • Extending configuration for .Net 3.5 Applications

    - by Maximiliano Rios
    Due to a requirement in my current project, I have to build a configuration manager that merges local config information with configuration stored in a database. Custom configuration sections don't fit my needs; the problem is that I don't know the handler's type before loading certain information. For example, only after loading the database information will I be able to know what myhandler's type is, not before. So I thought of writing my own handler, but I can't leave the type blank for sections; in fact .NET requires knowing the type in order to match the myhandler nodes. I'm thinking of building a different parser to read the XML nodes, but I would prefer to keep this structure. I've not found any information on how to do that yet. Is there any way? Can I extend or hook something into the framework so that it can load types on the fly and validate the nodes? Thanks in advance.

    Read the article

  • What are the Pros & Cons of using SQL Azure for existing apps on dedicated servers

    - by Mark Redman
    We currently own our own servers and rent a rack in a datacentre. Looking at the pricing, scalability and SLAs for SQL Azure, I am thinking it might be viable to use SQL Azure for the database only, while continuing to run our existing applications on our own servers in the datacentre. This would let us stop worrying about the database and its infrastructure so we can concentrate on building an application server farm with disk storage for files etc. Our application is quite big, has various Windows services, and parts of it use unmanaged libraries that may not be feasible in the cloud, so we probably couldn't move everything to the Azure cloud.

    The pros: reduced total cost of ownership (no database servers, no SQL Server licenses).

    The cons: I guess there would be overhead in the transfer of data between the Azure cloud and our datacentre (i.e. the cloud may be in the US while the datacentre is in the UK), but would that overhead be acceptable?

    Read the article

  • How to transfer SQLite db to web server on android phone

    - by Shane
    Hi, I have an application that creates an SQLite database and saves information to it over the course of a day. At the end of the day I want to export this database to a web server. Could anyone point me in the right direction for this? Should I use HTTP POST or PUT? I have researched this myself online, but there seem to be so many different ways to explore. The server side does not exist yet either; I have access to an Apache server, so I am hoping to use that. Could anyone advise me on the best/most simple way to do this? Thanks.
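
    On the Apache side, the receiving end could be as small as the PHP sketch below (a rough illustration, assuming the phone sends the database file as a multipart/form-data POST under a field named dbfile; upload.php and the destination path are made-up names):

        <?php
        // upload.php -- sketch of accepting the SQLite file posted from the phone.
        if (isset($_FILES['dbfile']) && $_FILES['dbfile']['error'] === UPLOAD_ERR_OK) {
            // Store it under a per-day name so each daily export is kept.
            $dest = '/var/uploads/backup-' . date('Y-m-d') . '.db';
            if (move_uploaded_file($_FILES['dbfile']['tmp_name'], $dest)) {
                echo 'OK';
            } else {
                header('HTTP/1.1 500 Internal Server Error');
                echo 'Could not save file';
            }
        } else {
            header('HTTP/1.1 400 Bad Request');
            echo 'No file received';
        }

    On the Android side, a multipart POST (for example with Apache HttpClient's HttpPost and a multipart entity) is the usual counterpart to a script like this.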

    Read the article
