Search Results

Search found 60391 results on 2416 pages for 'data generation'.

Page 241 of 2416

  • Replication - synchronizing most of the data some of the time

    - by uncle brad
    I have some data that isn't properly "partitioned" (for lack of a better word). All inserts, processing and reporting happen on the same table. The bulk of the processing happens not long after the insert and not long after that it becomes immutable (we're talking days). I could do all inserts and processing on a new table that I replicate to the old table. When I detect that the data has become immutable I would delete the data from the new table, but I would edit the delete replication stored procedure so that the delete did not replicate. How bad an idea is this? It seems attractive at the moment (I haven't slept on it yet) because it might mitigate a performance problem with only very small changes to the application. It also seems like it might be a good way to shoot myself in the foot.

    Read the article

  • jQuery Autocomplete fetch and parse data source with custom function, except not

    - by Ben Dauphinee
    So, I am working with the jQuery Autocomplete function and am trying to write a custom data parser. I'm not sure what I am doing incorrectly, but it throws an error when trying to call the autocompleteSourceParse function, saying that req is not set.

        setURL("ajax/clients/ac");
        function autocompleteSourceParse(req, add){
            var suggestions = [];
            $.getJSON(getURL()+"/"+req, function(data){
                $.each(data, function(i, val){
                    suggestions.push(val.name);
                });
                add(suggestions);
            });
            return(suggestions);
        }
        $("#company").autocomplete({
            source: autocompleteSourceParse(req, add),
            minLength: 2
        });
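
    Editor's note: a likely cause is that the source option is being given the result of calling autocompleteSourceParse (with req and add undefined at that point) rather than the function itself. For comparison, a minimal sketch of the usual jQuery UI wiring, where the widget passes request and response to a source callback - the "ajax/clients/ac" path stands in for the asker's getURL() endpoint:

        $("#company").autocomplete({
            minLength: 2,
            // jQuery UI calls this for each lookup; nothing is invoked by hand.
            source: function(request, response){
                $.getJSON("ajax/clients/ac/" + request.term, function(data){
                    // Map the JSON rows to plain strings and hand them back asynchronously.
                    response($.map(data, function(val){ return val.name; }));
                });
            }
        });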

    Read the article

  • Data-separation in a Symfony Multi-tenant app using Doctrine

    - by Prasad
    I am trying to implement a multi-tenant application, that is: the data of all clients lives in a single database, and each shared table has a tenant_id field to separate the data. I wish to achieve data separation by adding where('tenant_id = ', $user->getTenantID()) (pseudo-code) to all SELECT queries. I could not find any solution up front, but here are the possible approaches I am considering: 1) the crude approach: customizing all fetchAll and fetchOne functions in every class (I will go mad!); 2) using listeners: possibly coding for the preDqlSelect event and adding the 'where' to all queries; 3) overriding buildQuery(): I could not find an example of this for the front end; 4) implementing a content form filter: again, I need a pointer. I would appreciate it if someone could validate these and comment on their efficiency and suitability. Also, if anyone has achieved multi-tenancy using another strategy, please share. Thanks

    Read the article

  • Delete Range of Data From Text File With PHP

    - by Evan Byrne
    I want to delete a range of data from a text file using PHP. Let's assume the file contains the following: Hello, World! I want to delete everything from character 2 to character 7. The actual file I need to do this with is very large, so I don't want to have to read the whole file just to delete a small, given range of data. The data contained within the given range is not known, so str_replace or preg_replace solutions wouldn't work anyway. Thanks!

    Read the article

  • Fastest way to compress a database or .bak file and transfer it

    - by Nai
    As per the question title. I wonder if there are special programs or commands that make zipping up a .bak file and transferring it super quick. I read about xp_cmdshell here but I'm not sure about the speed. My .bak file is about 12 GB at the moment. Related to this is the possibility of using Red Gate's SQL Data Compare to just transfer the differential data across the network pipeline, but I have never used SQL Data Compare before and I'm not sure how it goes about doing INSERTs on tables with primary keys and such. Also, I'm not sure about the speed. Does anyone have any experience with this program or similar programs? Cheers!

    Read the article

  • Load table data on a JSP using Struts 2 with a fixed table structure

    - by Zahra
    Hi. I want to have a fixed table structure on my JSP page (3 rows, 4 columns), but I want to load the data for this table from the database using Struts 2. I know that if my table structure weren't fixed I could just get a List, iterate over it and add a row of data in every iteration, but how can I do this in this case? Also, if I don't have enough data to fill the table, I want those places to be empty. I didn't find a good example; if you could help me or point me to a good tutorial, it would be really appreciated. Thanks in advance.

    Read the article

  • How to merge data from two separate Access 2007 databases

    - by DiegoMaK
    Hi, I have two identical databases with the same structure: database A (a.accdb) on computer A and database B (b.accdb) on computer B. The data in the two databases is different; in database A I have, for example, IDs 1, 2, 3, and in database B I have IDs 4, 5, 6. I need to merge the data of these databases into a single database (A or B, it doesn't matter) so that the final database contains IDs 1, 2, 3, 4, 5, 6. I am looking for an easy way to do this, because I have many tables and doing it with union queries is tedious. I looked, for example, for a backup option for data only, without the schema, as in PostgreSQL and many other RDBMSs, but I don't see such an option in Access 2007. P.S. Only one table could have duplicate values (I guess the primary key won't allow a duplicate value to be copied, and all other values will copy fine); if I'm wrong, please correct me. Thanks for your help.

    Read the article

  • Most useful techniques for two-way data binding with JS

    - by Perpetualcoder
    With the abundance of web services and the client-side templating features of jQuery and the like, creating mashups or sites that consume a multitude of web services and post data back to those services is becoming exceedingly popular. For a page of decent size with this kind of architecture, say a dashboard: what are useful techniques for maintaining this client-side state? In other words, what are some of the ways to do two-way data binding? Sample scenario: 1. Get data from a service as JSON/XML. 2. Display/bind it on the UI. 3. Capture user input. 4. Recreate the request from the UI controls/HTML. 5. Post the data to the service. 6. Get the response and rebind.
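
    Editor's note: one hand-rolled approach is to keep a single JavaScript model object per widget and wire each form field to it in both directions. A minimal jQuery sketch follows; the /api/dashboard endpoint and the field names are hypothetical, purely for illustration:

        var model = {};

        function bindField(selector, key) {
            $(selector).val(model[key]);                  // model -> view
            $(selector).change(function () {              // view -> model
                model[key] = $(this).val();
            });
        }

        $.getJSON('/api/dashboard', function (data) {     // steps 1-2: fetch and bind
            model = data;
            bindField('#title', 'title');
            bindField('#visits', 'visits');
        });

        function save() {                                  // steps 3-6: post back and rebind
            $.post('/api/dashboard', model, function (response) {
                model = response;
                $('#title').val(model.title);              // refresh the view from the response
                $('#visits').val(model.visits);
            }, 'json');
        }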

    Read the article

  • Comparing two Java objects on the fly (data type not known)

    - by Narendra
    Hi all, I need to compare different data objects. Can anyone tell me how I can do this? I don't know in advance what data types I will get. If I need to use any util from Apache Commons, please give me a reference to it. At present I am using .equals() to compare objects for equality. It works fine when I am comparing two strings, but if I am comparing the java.sql.Date data type it reports the objects as unequal even though both contain the same values. Can anyone suggest something in this regard? Thanks, Narendra

    Read the article

  • What format do I use to store a relatively small amount of user data

    - by wcm
    I am writing a small program for our local high school (pro bono). The program has an interface that allows the user to enter school holidays. This is a simple stand-alone Windows app. What format should I use to store the data? A big relational database is obviously overkill. My initial plan was to store the data in an XML file. Co-workers have been suggesting that I use JSON files, Access databases, SQLite, and SQL Server Express. There was even a suggestion of old-school INI files.

    Read the article

  • Performance - User-defined query/filter to search data

    - by Cagatay Kalan
    What is the best way to design a system where users can create their own criteria for searching data? By "design" I mean data storage, the data access layer and the search structure. We are actually going to refactor an existing application which is written in C# and ASP.NET, and we don't want to change the infrastructure. Our main issue is performance; we use MSSQL and DevExpress to build queries. Some queries run for 4-5 minutes even though all the columns included in the queries have indexes. When I check the queries, I see that DevExpress builds too many "exists" clauses, and I'm not happy with that because I suspect some of these queries skip some indexes. What are the alternatives to DevExpress - NHibernate or Entity Framework? Can we build a dynamic criteria system with either of them and store the criteria in the database? And do we need any alternative storage, like a Lucene index or an OLAP database?

    Read the article

  • HTML 5 Canvas - Get pixel data for a path

    - by Mikey S.
    I wonder if there is any way I can get the pixel data for the currently drawn path in the canvas tag. I can calculate the pixel data on my own when drawing simple shapes like a square or a line, but things get messy with more complicated shapes like an ellipse or even a simple circle. The reason I'm asking is that I'm working on a web application which involves sending canvas pixel data to the server whenever I add a path to the canvas. The server needs to keep its own copy of the entire canvas, and I really don't want to send the ENTIRE canvas image on every single change, but only the delta, for efficiency reasons... Thanks.
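
    Editor's note: the 2D canvas API only reads pixels back rectangularly via getImageData, so one workable approach is to read back just the path's bounding box and send that region as the delta. A minimal sketch, assuming the bounding box (minX/minY/maxX/maxY) is tracked while the path is built and that the canvas element id and server endpoint are hypothetical:

        var ctx = document.getElementById('board').getContext('2d');

        // Read back only the region the path touched (RGBA bytes), not the whole canvas.
        function extractRegion(minX, minY, maxX, maxY) {
            var w = maxX - minX, h = maxY - minY;
            var imageData = ctx.getImageData(minX, minY, w, h);
            return {
                x: minX, y: minY, width: w, height: h,
                pixels: Array.prototype.slice.call(imageData.data)  // serializable copy
            };
        }

        // Usage: after stroking/filling a path, post just this delta to the server, e.g.
        // $.post('/canvas/delta', JSON.stringify(extractRegion(10, 10, 120, 90)));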

    Read the article

  • Macro to copy data from one sheet to another based on the current date

    - by SgtSnafu
    Does anyone have a macro that copies data from one sheet to another based on the current date? I am working with a single workbook of three sheets. Sheet one will hold the manual input of daily production figures for multiple plants; sheet two is to hold the ongoing daily data keyed in on sheet one. The macro will be associated with a button, so that once clicked it searches for every row that has today's date and copies that row to the next available blank row on sheet two. Sample data:

        Plant 1 Input
        Date - $ Produced - Labor Hours
        3-29-10 - 4538 - 8
        3-30-10 - 7862 - 12
        3-31-10
        4-1-10
        4-2-10

        Plant 2 Input
        Date - $ Produced - Labor Hours
        3-29-10 - 4545 - 9
        3-30-10 - 7645 - 12
        3-31-10
        4-1-10
        4-2-10

    Read the article

  • ASP.NET: disappearing data from a DropDownList inside a ListView when loading from a DataTable

    - by piotrmichal
    Hello, I have a problem with the DropDownList ddl1 that is inside the InsertItem template of a ListView. I populate it from a table on page load. In Page_Load I have:

        if (!Page.IsPostBack)
        {
            ListView1.DataBind();
            InsDropDownList3.DataSource = GetAllPlayersName();
            InsDropDownList3.DataBind();
        }
        DropDownList ddl1 = (DropDownList)ListView1.InsertItem.FindControl("InsDropDownList1");
        if (ddl1 != null)
        {
            DataTable t = GetAllPlayersName();
            ddl1.DataSource = t;
            ddl1.DataBind();
        }

    When I select a row in the ListView or edit it, the data from the DropDownList in the insert item row disappears - the list becomes empty until I close the page and open it again. There is another DropDownList, InsDropDownList3, outside the ListView, and there is no problem with that one. Why does one lose its data while the other one doesn't? As I understand it, ddl1 should get data every time the page loads.

    Read the article

  • Get length of data sent over the network to a TcpListener/NetworkStream in VB.NET

    - by Jonathan.
    It seems the most obvious thing, but I just can't work out how to get the length of the bytes sent over the network using a TcpClient and TcpListener. This is my code so far:

        'Must listen on correct port - must be same as port client wants to connect on.
        Const portNumber As Integer = 9999
        Dim tcpListener As New TcpListener(IPAddress.Parse("192.168.2.7"), portNumber)
        tcpListener.Start()
        Console.WriteLine("Waiting for connection...")
        'Accept the pending client connection and return
        'a TcpClient initialized for communication.
        Dim tcpClient As TcpClient = tcpListener.AcceptTcpClient()
        Console.WriteLine("Connection accepted.")
        ' Get the stream
        Dim networkStream As NetworkStream = tcpClient.GetStream()
        '' Read the stream into a byte array

    I need the length of the NetworkStream to set the size of the byte array I'm going to read the data into, but NetworkStream.Length is unsupported and throws a NotSupportedException. The only other way I can think of is to send the size of the data before sending the data itself, but this seems the long way round.

    Read the article

  • Exporting data from a GridView to different Excel worksheets

    - by Alex
    I am binding data from a DataSet to a grid and exporting the data from the grid to Excel. If the number of items in the grid is greater than 50,000, an error message is displayed, so I want to split the data and display it in different worksheets in Excel. (I am working in a web application.) I am using this code for exporting to Excel:

        gvExcel.DataSource = DTS;
        gvExcel.DataBind();
        Response.AddHeader("content-disposition", "attachment; filename= filename.xls");
        Response.ContentType = "application/excel";
        StringWriter sw = new StringWriter();
        HtmlTextWriter htw = new HtmlTextWriter(sw);
        gvExcel.RenderControl(htw);
        // Style is added dynamically
        Response.Write(style);
        Response.Write(sw.ToString());
        Response.End();

    Can anyone help me with this?

    Read the article

  • Is SELECT INTO able to affect data from its original table during UPDATE

    - by driveby
    While asking this question (asp.net scheduling timed events), user murph posted some insightful information: "The point about this is that it's very, very simple - you have a process for exchange that is performing a clearly defined task, and you have a high-frequency task that is not doing anything particularly complex; it's a straightforward query (select from table where sent = false and send at < value) - probably into a temporary table, so that you can run a single update query after you've done the sends - that you can optimise the index for. You're not trying to queue up a huge pile of event triggers, just one that fires once a minute and processes things that are due." Is it possible to SELECT data from table X INTO table Y and have the UPDATEs that are performed on table Y pushed into table X? I guess the alternative would be that the data gets updated in table Y and then an update command can be run on table X based on the data in table Y. What would be the advantage of selecting into another table? Thank you.

    Read the article

  • Google Analytics-style custom report builder UI

    - by gregmac
    I'm looking for a reporting engine/UI that can be integrated into a product, with a UI along the lines of Google Analytics' custom report builder. Is anyone aware of such a thing? The data in our case is not page views/visitors/etc., but it is similar in nature, in that there are a limited number of entities or types of data, but each entity has many attributes/columns and many different ways of aggregating data (or, in GA-style speak, metrics and dimensions). The Analytics-style UI is very intuitive and allows many reports to be created in powerful ways without having to know SQL. I have a preference for a web-based tool (seeing that it is 2010 and this is a web app - I mention this only because it seems the vast majority of reporting tools still have only a non-web-based creation tool).

    Read the article

  • Python sending incomplete data over a socket

    - by tipu
    I have this socket server script:

        import SocketServer
        import shelve
        import zlib

        class MyTCPHandler(SocketServer.BaseRequestHandler):
            def handle(self):
                self.words = shelve.open('/home/tipu/Dropbox/dev/workspace/search/words.db', 'r')
                self.tweets = shelve.open('/home/tipu/Dropbox/dev/workspace/search/tweets.db', 'r')
                param = self.request.recv(1024).strip()
                try:
                    result = str(self.words[param])
                except KeyError:
                    result = "set()"
                self.request.send(str(result))

        if __name__ == "__main__":
            HOST, PORT = "localhost", 50007
            SocketServer.TCPServer.allow_reuse_address = True
            server = SocketServer.TCPServer((HOST, PORT), MyTCPHandler)
            server.serve_forever()

    And this receiver:

        from django.http import HttpResponse
        from django.template import Context, loader
        import shelve
        import zlib
        import socket

        def index(req, param = ''):
            HOST = 'localhost'
            PORT = 50007
            s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            s.connect((HOST, PORT))
            s.send(param)
            data = zlib.decompress(s.recv(131072))
            s.close()
            print 'Received', repr(data)
            t = loader.get_template('index.html')
            c = Context({ 'foo' : data })
            return HttpResponse(t.render(c))

    I am sending strings to the receiver that are in the hundreds of kilobytes. I end up only receiving a portion of it. Is there a way that I can fix that so that the whole string is sent?

    Read the article

  • Error loading SQL_VARIANT data type using Python

    - by Brett D
    I am using Python and SQLAlchemy to query a database that I did not create. I have run into a problem querying a table that contains the SQL_VARIANT data type. I get the error: sqlalchemy.exc.DBAPIError: (Error) ('ODBC data type -150 is not supported. Cannot read column Value.', 'HY000') I confirmed with the database creator that the "Value" column is of type SQL_VARIANT. Does anyone know a way to load this data type using Python? I am currently using mssql with pyodbc. Thank you for any help you can offer! Versions: Python 2.7, SQLAlchemy 0.7.8

    Read the article

  • Failed to save data at the server from memcached program

    - by zahir hussain
    Hi, I want to know why I can't store a multi-dimensional array (array size is more than 1000).

        $memcache = new Memcache;
        $memcache->connect('localhost', 11211) or die ("Could not connect");

    The above is working correctly; the below one gives an error:

        $memcache->set('key', $sear, false, 60) or die ("Failed to save data at the server");

    If $sear is a string or an object array then there is no problem storing the data at the server, but when I store a multi-dimensional array in memcached I get the error "Failed to save data at the server". Thanks in advance.

    Read the article

  • Getting web page data as a JSON object?

    - by encryptor
    I have a URL, and I need the data of that page as a JSON object. I've tried both XMLHttpRequest and ajaxObject, but neither works. It doesn't even give a responseText when I show it in an alert. I'll post both code snippets here. The URL is of the form http://mydomain.com:port/a/b/c. AJAX:

        var ajaxRequest = new ajaxObject(URL);
        ajaxRequest.callback = function (responseText, responseStatus) {
            alert(responseStatus);
            JSONData = responseText.parseJSON();
            processData(JSONData);
        }

    Using XMLHttpRequest:

        var client = new XMLHttpRequest();
        client.open('GET', URL, true);
        data = JSON.parse(client.responseText);
        alert(data.links.length);

    Can someone please help me out with this? I understand cross-site scripting may be an issue, but how do I get around it? And shouldn't it at least give the alerts as zero or null?
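
    Editor's note: in the XMLHttpRequest snippet the response is read immediately after open(), before send() has been called or the request has completed, so responseText is still empty at that point. A minimal sketch of the asynchronous pattern (same-origin only; a cross-domain URL would still need JSONP or a server-side proxy; the getJson helper name is illustrative):

        function getJson(url, callback) {
            var client = new XMLHttpRequest();
            client.open('GET', url, true);
            client.onreadystatechange = function () {
                // Only parse once the request has finished successfully.
                if (client.readyState === 4 && client.status === 200) {
                    callback(JSON.parse(client.responseText));
                }
            };
            client.send(null);
        }

        // Usage:
        // getJson('/a/b/c', function (data) { alert(data.links.length); });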

    Read the article

  • Data access from a single table in SQL Server 2005 is too slow

    - by Muhammad Kashif Nadeem
    Following is the script of the table. Accessing data from this table is too slow.

        SET ANSI_NULLS ON
        GO
        SET QUOTED_IDENTIFIER ON
        GO
        CREATE TABLE [dbo].[Emails](
            [id] [int] IDENTITY(1,1) NOT NULL,
            [datecreated] [datetime] NULL CONSTRAINT [DF_Emails_datecreated] DEFAULT (getdate()),
            [UID] [nvarchar](250) COLLATE Latin1_General_CI_AS NULL,
            [From] [nvarchar](100) COLLATE Latin1_General_CI_AS NULL,
            [To] [nvarchar](100) COLLATE Latin1_General_CI_AS NULL,
            [Subject] [nvarchar](max) COLLATE Latin1_General_CI_AS NULL,
            [Body] [nvarchar](max) COLLATE Latin1_General_CI_AS NULL,
            [HTML] [nvarchar](max) COLLATE Latin1_General_CI_AS NULL,
            [AttachmentCount] [int] NULL,
            [Dated] [datetime] NULL
        ) ON [PRIMARY]

    The following query takes 50 seconds to fetch the data:

        select id, datecreated, UID, [From], [To], Subject, AttachmentCount, Dated
        from emails

    If I include Body and HTML in the select, the time is even worse. The indexes are: id - unique, clustered; From - non-unique, non-clustered; To - non-unique, non-clustered. The table currently has 180,000+ records, and there might be 100,000 new records each month, so this will become slower as time passes. Will splitting the data into two tables solve the problem? What other indexes should there be?

    Read the article

  • How to scale an image (in data URI format) in JavaScript (real scaling, not using styling)

    - by 103067513055141045393
    We are capturing the visible tab in a Chrome browser (by using the extensions API chrome.tabs.captureVisibleTab) and receiving a snapshot in the data URI scheme (a Base64-encoded string). Is there a JavaScript library that can be used to scale an image down to a certain size? Currently we are scaling it via CSS, but we pay performance penalties, as the pictures are mostly 100 times bigger than required. An additional concern is the load on the localStorage we use to save our snapshots. Does anyone know of a way to process these data-URI-formatted pictures and reduce their size by scaling them down? References: Data URI scheme on http://en.wikipedia.org/wiki/Data_URI_scheme, Chrome Extensions API on http://code.google.com/chrome/extensions/tabs.html, the "Recently Closed Tabs" Chrome Extension on http://code.google.com/p/recently-closed-tabs
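
    Editor's note: no extra library is strictly needed - the snapshot can be drawn onto an offscreen canvas at the target size and re-encoded, which genuinely shrinks the stored data URI rather than just displaying it smaller. A minimal sketch; the target width, JPEG quality and storage key are illustrative assumptions:

        function scaleDataUri(dataUri, targetWidth, callback) {
            var img = new Image();
            img.onload = function () {
                var canvas = document.createElement('canvas');
                canvas.width = targetWidth;
                canvas.height = Math.round(img.height * targetWidth / img.width);
                // Draw the full-size snapshot into the smaller canvas, then re-encode it.
                canvas.getContext('2d').drawImage(img, 0, 0, canvas.width, canvas.height);
                callback(canvas.toDataURL('image/jpeg', 0.8));
            };
            img.src = dataUri;
        }

        // Usage inside the extension callback (hypothetical storage key):
        // chrome.tabs.captureVisibleTab(null, function (uri) {
        //     scaleDataUri(uri, 320, function (small) { localStorage.setItem('snap', small); });
        // });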

    Read the article

  • Visual Studio 2010 and Silverlight - Adding Data Sources

    - by Villager
    Hello, I am interested in building a Silverlight application that uses RIA Services. I am using this video (http://live.visitmix.com/MIX10/Sessions/CL08) as an example. In that video, the presenter uses the "data sources" tab to populate the view. However, I cannot figure out how to add a data source from within Visual Studio 2010. I have a database on my local machine. This database is the sample AdventureWorks database. When I select my Silverlight application, there is a UserRegistrationContext in the data sources window. However, I cannot figure out how to add a new one that connects to my AdventureWorks database. Can somebody tell me how to do this? Thank you!

    Read the article
