Search Results

Search found 193 results on 8 pages for 'marko carter'.

Page 7/8

  • Which workaround to use for the following SQL deadlock?

    - by Marko
    I found a SQL deadlock scenario in my application during concurrency. I believe the two statements that cause the deadlock are the following (note: I'm using LINQ2SQL and DataContext.ExecuteCommand(), which is where this.studioId.ToString() comes into play):

        exec sp_executesql N'INSERT INTO HQ.dbo.SynchronizingRows ([StudioId], [UpdatedRowId])
        SELECT @p0, [t0].[Id]
        FROM [dbo].[UpdatedRows] AS [t0]
        WHERE NOT (EXISTS(
            SELECT NULL AS [EMPTY]
            FROM [dbo].[ReceivedUpdatedRows] AS [t1]
            WHERE ([t1].[StudioId] = @p0) AND ([t1].[UpdatedRowId] = [t0].[Id])
            ))',N'@p0 uniqueidentifier',@p0='" + this.studioId.ToString() + "';

    and

        exec sp_executesql N'INSERT INTO HQ.dbo.ReceivedUpdatedRows ([UpdatedRowId], [StudioId], [ReceiveDateTime])
        SELECT [t0].[UpdatedRowId], @p0, GETDATE()
        FROM [dbo].[SynchronizingRows] AS [t0]
        WHERE ([t0].[StudioId] = @p0)',N'@p0 uniqueidentifier',@p0='" + this.studioId.ToString() + "';

    The basic logic of my (client-server) application is this:

    1. Every time someone inserts or updates a row on the server side, I also insert a row into the table UpdatedRows, specifying the RowId of the modified row.
    2. When a client tries to synchronize data, it first copies all of the rows in the UpdatedRows table that don't have a reference row for that client in the ReceivedUpdatedRows table into the SynchronizingRows table (the first statement taking part in the deadlock). During synchronization I then look up modified rows via the SynchronizingRows table. This step is required; otherwise, if someone inserts or modifies rows on the server side during synchronization, I would miss them and never get them during the next synchronization (the explanation scenario is too long to write here...).
    3. Once synchronization is complete, I insert rows into the ReceivedUpdatedRows table recording that this client has received the UpdatedRows contained in the SynchronizingRows table (the second statement taking part in the deadlock).
    4. Finally, I delete all rows from the SynchronizingRows table that belong to the current client.

    The way I see it, the deadlock occurs on the tables SynchronizingRows (SR) and ReceivedUpdatedRows (RUR) during steps 2 and 3: one client is in step 2, inserting into SR and selecting from RUR, while another client is in step 3, inserting into RUR and selecting from SR. I googled a bit about SQL deadlocks and came to the conclusion that I have three options. In order to make a decision I need more input about each option/workaround.

    Workaround 1: The first advice given on the web about SQL deadlocks - restructure the tables/queries so that deadlocks don't happen in the first place. The only problem with this is that, with my IQ, I don't see a way to do the synchronization logic any differently. If someone wishes to delve deeper into my current synchronization logic - how and why it is set up the way it is - I'll post a link to the explanation. Perhaps, with the help of someone smarter than me, it's possible to create a logic that is deadlock free.

    Workaround 2: The second most common advice seems to be the WITH(NOLOCK) hint. The problem with this is that NOLOCK might miss or duplicate some rows. Duplication is not a problem, but missing rows is catastrophic! Another option is the WITH(READPAST) hint. On the face of it, this seems to be a perfect solution: I really don't care about rows that other clients are inserting/modifying, because each row belongs only to a specific client, so I may very well skip locked rows. But the MSDN documentation makes me a bit worried: "When READPAST is specified, both row-level and page-level locks are skipped". As I said, skipping row-level locks would not be a problem, but skipping page-level locks may very well be, since a page might contain rows that belong to multiple clients (including the current one). While there are lots of blog posts specifically mentioning that NOLOCK might miss rows, there seem to be none stating that READPAST never misses rows. This makes me skeptical and nervous about implementing it, since there is no easy way to test it (implementing it would be a piece of cake - just pop WITH(READPAST) into the SELECT clause of both statements and the job is done). Can someone confirm whether the READPAST hint can miss rows?

    Workaround 3: The final option is to use ALLOW_SNAPSHOT_ISOLATION and READ_COMMITTED_SNAPSHOT. This seems to be the only option guaranteed to work - at least I can't find any information that contradicts it. But it is a little trickier to set up (I don't care much about the performance hit), because I'm using LINQ. Off the top of my head, I probably need to manually open a SQL connection and pass it to the LINQ2SQL DataContext, etc.; I haven't looked into the specifics very deeply.

    Mostly I would prefer option 2, if someone could only reassure me that READPAST will never miss rows belonging to the current client (as I said before, each client has, and only ever deals with, its own set of rows). Otherwise I'll likely have to implement option 3, since option 1 is probably impossible...

    I'll post the table definitions for the three tables as well, just in case:

        CREATE TABLE [dbo].[UpdatedRows](
            [Id] [uniqueidentifier] NOT NULL ROWGUIDCOL DEFAULT NEWSEQUENTIALID() PRIMARY KEY CLUSTERED,
            [RowId] [uniqueidentifier] NOT NULL,
            [UpdateDateTime] [datetime] NOT NULL,
        ) ON [PRIMARY]
        GO
        CREATE NONCLUSTERED INDEX IX_RowId ON dbo.UpdatedRows ([RowId] ASC)
            WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
        GO

        CREATE TABLE [dbo].[ReceivedUpdatedRows](
            [Id] [uniqueidentifier] NOT NULL ROWGUIDCOL DEFAULT NEWSEQUENTIALID() PRIMARY KEY NONCLUSTERED,
            [UpdatedRowId] [uniqueidentifier] NOT NULL REFERENCES [dbo].[UpdatedRows] ([Id]),
            [StudioId] [uniqueidentifier] NOT NULL REFERENCES,
            [ReceiveDateTime] [datetime] NOT NULL,
        ) ON [PRIMARY]
        GO
        CREATE CLUSTERED INDEX IX_Studios ON dbo.ReceivedUpdatedRows ([StudioId] ASC)
            WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
        GO

        CREATE TABLE [dbo].[SynchronizingRows](
            [StudioId] [uniqueidentifier] NOT NULL,
            [UpdatedRowId] [uniqueidentifier] NOT NULL REFERENCES [dbo].[UpdatedRows] ([Id]),
            PRIMARY KEY CLUSTERED ([StudioId], [UpdatedRowId])
        ) ON [PRIMARY]
        GO

    PS! Studio = Client. PS2! I just noticed that the index definitions have ALLOW_PAGE_LOCKS = ON. If I turned it off, would that make any difference to READPAST? Are there any downsides to turning it off?
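
    If workaround 3 turns out to be the route taken, a minimal sketch of wiring LINQ to SQL into a snapshot transaction might look like the following (HQDataContext and connectionString are placeholders, not names from the question; the ALTER DATABASE statements are a one-time setup run outside the application):

        // using System.Data; using System.Data.Linq; using System.Data.SqlClient;
        //
        // One-time setup (run separately, e.g. in SSMS):
        //   ALTER DATABASE HQ SET ALLOW_SNAPSHOT_ISOLATION ON;
        //   ALTER DATABASE HQ SET READ_COMMITTED_SNAPSHOT ON;   -- needs exclusive access to the database

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var transaction = connection.BeginTransaction(IsolationLevel.Snapshot))
            using (var db = new HQDataContext(connection))       // hand the open connection to LINQ to SQL
            {
                db.Transaction = transaction;                    // enlist ExecuteCommand() calls in the snapshot transaction
                db.ExecuteCommand("...");                        // the two synchronization statements from above go here
                transaction.Commit();
            }
        }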

    Read the article

  • jQuery animation loop not working

    - by Marko Ivanovski
    Hi, I'm trying to create a looping animation that starts on mousedown and stops on mouseup. The effect is a simple scroll that keeps looping until you release the mouse. I've created a function which performs the .animate() method and passes itself as the callback, but the code only runs once. Here's the entire code:

        $(document).ready(function() {
            var $scroller = $("#scroller");
            var $content = $("#content", $scroller);
            // lineHeight equal to 1 line of text
            var lineHeight = $content.css('line-height');
            // Amount to scroll = 3 lines of text at a time
            var amountToScroll = lineHeight.replace("px", "") * 3;
            var maxScroll = $content.height() - $scroller.height();

            function scrollDown() {
                var topCoord = $content.css("top").replace("px", "");
                if (topCoord > -maxScroll) {
                    if ((-maxScroll - topCoord) > -amountToScroll) {
                        $content.stop().animate({ top: -maxScroll }, 1000);
                    } else {
                        $content.stop().animate({ top: "-=" + amountToScroll }, 1000, function() { scrollDown() });
                    }
                }
            }

            function scrollUp() {
                var topCoord = $content.css("top").replace("px", "");
                if (topCoord < 0) {
                    if (topCoord > -amountToScroll) {
                        $content.stop().animate({ top: 0 }, 1000);
                    } else {
                        $content.stop().animate({ top: "+=" + amountToScroll }, 1000, function() { scrollUp() });
                    }
                }
            }

            $("#scroll-down").mousedown(function() { scrollDown(); });
            $("#scroll-down").mouseup(function() { $content.stop(); });
            $("#scroll-up").mousedown(function() { scrollUp(); });
            $("scroll-up").mouseup(function() { $content.stop(); });
        });
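
    A hedged sketch of the same loop with the numeric values parsed explicitly, the missing "#" in the last selector restored, and the animation queue cleared on stop (assumptions: the markup uses the #scroller/#content/#scroll-up/#scroll-down IDs from the question, and line-height resolves to a pixel value):

        $(document).ready(function() {
            var $scroller = $("#scroller");
            var $content = $("#content", $scroller);
            var lineHeight = parseInt($content.css('line-height'), 10);   // NaN if line-height is "normal"
            var amountToScroll = lineHeight * 3;
            var maxScroll = $content.height() - $scroller.height();

            function step(direction) {
                var top = parseInt($content.css("top"), 10) || 0;
                var target = direction === 'down'
                    ? Math.max(top - amountToScroll, -maxScroll)
                    : Math.min(top + amountToScroll, 0);
                if (target === top) return;                               // nothing left to scroll
                $content.stop(true).animate({ top: target }, 1000, function() { step(direction); });
            }

            $("#scroll-down").mousedown(function() { step('down'); });
            $("#scroll-up").mousedown(function() { step('up'); });
            $("#scroll-down, #scroll-up").mouseup(function() { $content.stop(true); });
        });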

    Read the article

  • Domain validates but won't save

    - by marko
    I have the following setup: a class, say, Car that has a CarPart (belongsTo = [car: Car]). When I'm creating a Car I also want to create some default CarParts, so I do:

        def car = new Car(bla bla bla)
        def part = new CarPart(car: car)

    Now, when I do car.validate() or part.validate() it seems fine. But when I do if (car.save() && part.save()) I get this exception:

        2012-03-24 14:02:21,943 [http-8080-4] ERROR util.JDBCExceptionReporter - Batch entry 0 insert into car_part (version, car_id, id) values ('0', '297', '298') was aborted. Call getNextException to see the cause.
        2012-03-24 14:02:21,943 [http-8080-4] ERROR util.JDBCExceptionReporter - ERROR: value too long for type character varying(6)
        2012-03-24 14:02:21,943 [http-8080-4] ERROR events.PatchedDefaultFlushEventListener - Could not synchronize database state with session
        org.hibernate.exception.DataException: Could not execute JDBC batch update
        Stacktrace follows:
        java.sql.BatchUpdateException: Batch entry 0 insert into car_part (version, deal_id, id) values ('0', '297', '298') was aborted. Call getNextException to see the cause.
            at org.postgresql.jdbc2.AbstractJdbc2Statement$BatchResultHandler.handleError(AbstractJdbc2Statement.java:2621)
            at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1837)
            at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:407)
            at org.postgresql.jdbc2.AbstractJdbc2Statement.executeBatch(AbstractJdbc2Statement.java:2754)
            at $Proxy20.flush(Unknown Source)
            at ristretto.DealController$_closure5.doCall(DealController.groovy:109)
            at ristretto.DealController$_closure5.doCall(DealController.groovy)
            at java.lang.Thread.run(Thread.java:722)

    Any ideas?
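
    The flush-time error ("value too long for type character varying(6)") points at a String property whose database column is varchar(6) receiving a longer value; validate() only checks the Grails constraints, not the live column definition. A hedged sketch of the usual places to look (partCode and the maxSize value are made up for illustration):

        // In the domain class: make the constraint match (or drive) the column size,
        // then recreate or migrate the schema so the varchar(6) column is widened.
        class CarPart {
            static belongsTo = [car: Car]
            String partCode

            static constraints = {
                partCode maxSize: 50   // Grails derives the generated column length from maxSize
            }
        }

        // In the controller/service: flush immediately and fail loudly so the offending
        // property is reported at the save() call site rather than at session flush.
        def car = new Car(/* ... */)
        car.save(flush: true, failOnError: true)
        def part = new CarPart(car: car)
        part.save(flush: true, failOnError: true)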

    Read the article

  • Unwanted automated creation of new instances of an activity class

    - by Marko
    I have an activity (called Sender) with the most basic UI: only a button that sends a message when clicked. In the onClickListener I only call this method:

        private void sendSMS(String msg) {
            PendingIntent pi = PendingIntent.getActivity(this, 0, new Intent(this, Sender.class), 0);
            SmsManager sms = SmsManager.getDefault();
            sms.sendTextMessage("1477", null, msg, pi, null);
        }

    This works OK and the message is sent, but every time a message is sent a new instance of Sender is started on top of the other. If I call the sendSMS method three times, three new instances are started. I'm quite new to Android, so I need some help with this; I only want the same Sender to be on all the time.
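
    The PendingIntent passed as the sentIntent is fired when the SMS has been sent, and because it was created with getActivity(..., new Intent(this, Sender.class), ...) it launches a fresh Sender each time. A hedged sketch of the usual alternative - listen for the send result with a broadcast instead of an activity (the "SMS_SENT" action string is an arbitrary name chosen here):

        private void sendSMS(String msg) {
            // In real code, register this receiver once (e.g. in onCreate) instead of on every send.
            registerReceiver(new BroadcastReceiver() {
                @Override
                public void onReceive(Context context, Intent intent) {
                    // getResultCode() is Activity.RESULT_OK when the message went out
                    Toast.makeText(context, "SMS sent, result=" + getResultCode(), Toast.LENGTH_SHORT).show();
                }
            }, new IntentFilter("SMS_SENT"));

            PendingIntent sentPI = PendingIntent.getBroadcast(this, 0, new Intent("SMS_SENT"), 0);
            SmsManager sms = SmsManager.getDefault();
            sms.sendTextMessage("1477", null, msg, sentPI, null);
        }

    Alternatively, declaring android:launchMode="singleTop" on the Sender <activity> in the manifest keeps the existing instance on top instead of stacking new ones.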

    Read the article

  • How to call a Win32 DLL function with custom objects from C#

    - by marko
    How do I call, from C#, a function exported by a Win32 DLL written in Delphi when the function parameters are custom Delphi types?

    Function definition in Delphi:

        function GetAttrbControls(
            Code      : PChar;
            InputList : TItemList;
            var Values: TValArray): Boolean; stdcall; export;

    Types used:

        type
          TItem = packed record
            Code      : PChar;
            ItemValue : Variant;
          end;
          TItemList = array of TItem;
          TValArray = array of PChar;

    Example in C# (doesn't work):

        [StructLayout(LayoutKind.Sequential)]
        public class Input
        {
            public string Code;
            public object ItemValue;
        };

        [DllImport("Filename.dll", EntryPoint = "GetValues", CharSet = CharSet.Ansi, ExactSpelling = true, CallingConvention = CallingConvention.Cdecl)]
        public static extern bool GetValues(string Code, Input[] InputList, ref StringBuilder[] Values);
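
    Three things stand out: Delphi dynamic arrays (array of ...) and Variant have no stable layout that P/Invoke can marshal, the Delphi routine is stdcall while the DllImport says Cdecl, and a Delphi Boolean is a single byte. A hedged sketch, assuming the Delphi export can be changed to take a record array plus a count and to use PAnsiChar instead of Variant (the flattened signature below is an assumption, not the DLL's current one):

        // using System; using System.Runtime.InteropServices;

        [StructLayout(LayoutKind.Sequential, CharSet = CharSet.Ansi)]
        public struct TItem
        {
            [MarshalAs(UnmanagedType.LPStr)] public string Code;
            [MarshalAs(UnmanagedType.LPStr)] public string ItemValue;    // was Variant on the Delphi side
        }

        public static class NativeMethods
        {
            [DllImport("Filename.dll", CallingConvention = CallingConvention.StdCall, CharSet = CharSet.Ansi)]
            [return: MarshalAs(UnmanagedType.U1)]                        // Delphi Boolean is 1 byte
            public static extern bool GetAttrbControls(
                string code,
                [In] TItem[] inputList, int inputCount,                  // pass the length explicitly
                [Out] IntPtr[] values, int valuesCount);                 // caller-allocated; read back with Marshal.PtrToStringAnsi
        }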

    Read the article

  • Serving large generated files using Google App Engine?

    - by John Carter
    Hiya, presently I have a GAE app that does some offline processing (backs up a user's data) and generates a file that's somewhere in the neighbourhood of 10-100 MB. I'm not sure of the best way to serve this file to the user. The two options I'm considering are: adding some code to the offline processing code that 'spoofs' it as a form upload to the blobstore, and going through the normal blobstore process to serve the file; or having the offline processing code store the file somewhere off of GAE and serving it from there. Is there a much better approach I'm overlooking? I'm guessing this is functionality that isn't well suited to GAE. I had thought of storing it in the datastore as db.Text or db.Blob, but there I encounter the 1 MB limit. Any input would be appreciated.
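
    One route that avoids the form-upload 'spoofing' is the Blobstore Files API (experimental at the time, since deprecated in favour of Google Cloud Storage), which lets server-side code write a blob directly and then serve it through a download handler. A rough sketch under that assumption (Python runtime; handler wiring omitted):

        from google.appengine.api import files
        from google.appengine.ext import blobstore
        from google.appengine.ext.webapp import blobstore_handlers

        def write_backup(data):
            # Create a writable blob, append the generated bytes in chunks, finalize it.
            file_name = files.blobstore.create(mime_type='application/octet-stream')
            with files.open(file_name, 'a') as f:
                for i in range(0, len(data), 900000):   # stay under the per-call write size limit
                    f.write(data[i:i + 900000])
            files.finalize(file_name)
            return files.blobstore.get_blob_key(file_name)

        class ServeBackup(blobstore_handlers.BlobstoreDownloadHandler):
            def get(self, blob_key):
                self.send_blob(blobstore.BlobInfo.get(blob_key), save_as=True)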

    Read the article

  • Partition a rectangle into near-squares of given areas

    - by Marko Dumic
    I have a set of N positive numbers and a rectangle of dimensions X and Y that I need to partition into N smaller rectangles such that:

    - the surface area of each smaller rectangle is proportional to its corresponding number in the given set,
    - all of the big rectangle's space is occupied and there is no leftover space between the smaller rectangles,
    - each small rectangle is shaped as close to a square as feasible,
    - the execution time is reasonably small.

    I need directions on this. Do you know of such an algorithm described on the web? Do you have any ideas (pseudo-code is fine)? Thanks.
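
    These criteria are essentially the treemap-layout problem; the squarified treemap algorithm (Bruls, Huizing, van Wijk) targets exactly this aspect-ratio goal. A simpler hedged sketch in the same spirit - split the weights into two halves of roughly equal total and cut the rectangle along its longer side in that proportion, recursing - is shown below (not an optimal layout, just an illustration of the idea):

        using System;
        using System.Collections.Generic;
        using System.Linq;

        record Rect(double X, double Y, double Width, double Height);

        static class Partitioner
        {
            // Usage: var rects = new List<Rect>(); Partitioner.Split(new Rect(0, 0, X, Y), weights, rects);
            public static void Split(Rect r, List<double> weights, List<Rect> output)
            {
                if (weights.Count == 1) { output.Add(r); return; }

                // Find the prefix whose sum is closest to half of the total weight.
                double total = weights.Sum();
                int cut = 1; double best = double.MaxValue, running = 0;
                for (int i = 0; i < weights.Count - 1; i++)
                {
                    running += weights[i];
                    double diff = Math.Abs(total / 2 - running);
                    if (diff < best) { best = diff; cut = i + 1; }
                }
                double ratio = weights.Take(cut).Sum() / total;

                // Cut along the longer side so the children stay closer to squares.
                Rect a, b;
                if (r.Width >= r.Height)
                {
                    a = new Rect(r.X, r.Y, r.Width * ratio, r.Height);
                    b = new Rect(r.X + r.Width * ratio, r.Y, r.Width * (1 - ratio), r.Height);
                }
                else
                {
                    a = new Rect(r.X, r.Y, r.Width, r.Height * ratio);
                    b = new Rect(r.X, r.Y + r.Height * ratio, r.Width, r.Height * (1 - ratio));
                }
                Split(a, weights.Take(cut).ToList(), output);
                Split(b, weights.Skip(cut).ToList(), output);
            }
        }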

    Read the article

  • How to manually add Core Data to project?

    - by marko
    E.g., what does the "Use Core Data for storage" checkbox mean when creating a new Window-based or Navigation-based project? How do I add Core Data to a Tab Bar Application? How do I initialize managedObjectModel, managedObjectContext and persistentStoreCoordinator?
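
    The checkbox only makes the Xcode template link the Core Data framework and add three lazily-created objects to the app delegate; the same can be added by hand to any template. A condensed sketch of those accessors (manual reference counting, as in the iOS SDKs of that era; it assumes the three ivars are declared in the app delegate, and "Store.sqlite" is a placeholder file name):

        - (NSManagedObjectModel *)managedObjectModel {
            if (managedObjectModel == nil) {
                managedObjectModel = [[NSManagedObjectModel mergedModelFromBundles:nil] retain];
            }
            return managedObjectModel;
        }

        - (NSPersistentStoreCoordinator *)persistentStoreCoordinator {
            if (persistentStoreCoordinator == nil) {
                NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
                NSURL *storeUrl = [NSURL fileURLWithPath:[docs stringByAppendingPathComponent:@"Store.sqlite"]];
                NSError *error = nil;
                persistentStoreCoordinator = [[NSPersistentStoreCoordinator alloc]
                    initWithManagedObjectModel:[self managedObjectModel]];
                if (![persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType
                                                              configuration:nil
                                                                        URL:storeUrl
                                                                    options:nil
                                                                      error:&error]) {
                    NSLog(@"Unresolved Core Data error %@, %@", error, [error userInfo]);
                }
            }
            return persistentStoreCoordinator;
        }

        - (NSManagedObjectContext *)managedObjectContext {
            if (managedObjectContext == nil) {
                managedObjectContext = [[NSManagedObjectContext alloc] init];
                [managedObjectContext setPersistentStoreCoordinator:[self persistentStoreCoordinator]];
            }
            return managedObjectContext;
        }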

    Read the article

  • Data 'logger' turning off when phone goes to standby mode

    - by Marko Järvenpää
    I'm creating a data logger which logs the GPS data and the sensor data of the phone. I have a strange problem: if the phone is not touched for a few minutes it goes into standby mode (the screen goes black), and that causes the logger to stop working. Actually, it's the file writing in the logger that stops. The GPS resumes fine after coming out of the black screen, but when I check the log I created it only shows saved points for a few minutes. Does anyone have an idea what is causing this?
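
    When the screen turns off the CPU is allowed to sleep, which typically halts background work like this unless the app holds a wake lock. A hedged sketch (requires the android.permission.WAKE_LOCK permission in the manifest; the tag string is arbitrary):

        PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
        PowerManager.WakeLock wakeLock = pm.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "DataLogger");

        wakeLock.acquire();        // keep the CPU running while the screen is off
        try {
            // ... run the GPS/sensor logging and file writing here ...
        } finally {
            wakeLock.release();    // always release, or the battery drains
        }

    Moving the logging into a (foreground) Service rather than an Activity also helps it survive the screen turning off.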

    Read the article

  • How to read binary column in database into image on asp.net page?

    - by marko
    I want to read from the database, where I've stored an image in a binary field, and display the image.

        while (reader.Read())
        {
            byte[] imgarr = (byte[])reader["file"];
            Stream s = new MemoryStream(imgarr);
            System.Drawing.Image image = System.Drawing.Image.FromStream(s);
            Graphics g = Graphics.FromImage(image);
            g.DrawImage(image, new Point(400, 10));
            image.Save(Response.OutputStream, ImageFormat.Jpeg);
            g.Dispose();
            image.Dispose();
        }
        con.Close();

    This piece of code doesn't work:

        System.Drawing.Image image = System.Drawing.Image.FromStream(s);

    What am I doing wrong?
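
    If the stored bytes are already a valid JPEG, there is no need to round-trip through System.Drawing at all; a minimal sketch of a page (or, better, a dedicated .ashx handler) that simply streams the column to the response (column and connection names follow the question):

        byte[] imgarr = null;
        if (reader.Read())
        {
            imgarr = (byte[])reader["file"];
        }
        con.Close();

        if (imgarr != null)
        {
            Response.Clear();
            Response.ContentType = "image/jpeg";
            Response.BinaryWrite(imgarr);
            Response.End();
        }

    If Image.FromStream() throws on these bytes, the stored data itself is usually the problem (e.g. the image was truncated or altered when it was inserted), not the reading code.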

    Read the article

  • how to format jsf 2.0 <f:selectItems/> with date value from List

    - by Marko
    Hi, I'm using JSF 2.0 to develop an app where the user has to select (using a radio button) a date from a list of possible choices. The list of dates is a managed bean property of type List<java.util.Date>. I'm using Facelets to display the radio buttons:

        <h:selectOneRadio value="#{banner_backing.selectedInterval}" border="1" layout="pageDirection">
            <f:selectItems value="#{banner_backing.avaliableIntervals}" var="interval">
            </f:selectItems>
        </h:selectOneRadio>

    Here is my question: how do I format the selectItems label and value in a pattern other than the default (Fri May 28 00:00:00 CEST 2010), like 'HH:mm:ss dd/MM/yyyy'?
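
    One hedged approach (assuming the container supports EL 2.2 method calls with arguments) is to keep the Date as the item value and format only the label through a backing-bean method; the format() method below is an addition, not part of the original bean:

        <h:selectOneRadio value="#{banner_backing.selectedInterval}" border="1" layout="pageDirection">
            <f:selectItems value="#{banner_backing.avaliableIntervals}" var="interval"
                           itemValue="#{interval}"
                           itemLabel="#{banner_backing.format(interval)}" />
        </h:selectOneRadio>

        public String format(Date date) {
            return new SimpleDateFormat("HH:mm:ss dd/MM/yyyy").format(date);
        }

    A javax.faces.convert.DateTimeConverter registered on the component is the other common route when only the submitted value (rather than the label) needs formatting.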

    Read the article

  • How to programmatically register a component that depends on a list of already registered components

    - by Chris Carter
    I'm programmatically registering a group of services that all implement the same interface, IRule. I have another service that looks like this:

        public class MyService
        {
            private IEnumerable<IRule> _rules;

            public MyService(IEnumerable<IRule> rules)
            {
                _rules = rules;
            }
        }

    Hammett posted something that looked like what I wanted, http://hammett.castleproject.org/?p=257. I changed the signature to IRule[] and tried the ArrayResolver trick in the post, but that didn't work for me (note: it didn't break anything either). Does anyone know how to programmatically register a component like the code I posted above?
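
    A hedged sketch using Castle Windsor's CollectionResolver (available from Windsor 2.5 onward), which satisfies IEnumerable<IRule> and IRule[] constructor dependencies from every registered IRule component; the registration syntax shown is the Windsor 3 Classes API:

        // using Castle.MicroKernel.Registration;
        // using Castle.MicroKernel.Resolvers.SpecializedResolvers;
        // using Castle.Windsor;

        var container = new WindsorContainer();
        container.Kernel.Resolver.AddSubResolver(
            new CollectionResolver(container.Kernel, allowEmptyCollections: true));

        container.Register(
            Classes.FromThisAssembly().BasedOn<IRule>().WithServiceBase(),
            Component.For<MyService>());

        var service = container.Resolve<MyService>();   // _rules now holds all registered IRule implementations

    On older Windsor versions, the ArrayResolver from the linked post plays the same role but only for the IRule[] signature.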

    Read the article

  • Need new method for linking to native mapping from mobile web app

    - by Carter
    My mobile web apps use a map button which automatically starts the mapping features of Android and iPhone by simply linking to http://maps.google.com/maps?q=New+York. iOS 6 comes out and the links stop working, because Apple wants us to use "maps.APPLE.com". It turns out ANYTHING you send to "maps.apple.com" gets forwarded to "maps.google.com". So now I have to specially detect iOS 6 and swap out links just so Apple can forward everything back to Google anyway. Is there a clean way to open the device/native mapping app from a mobile web app that works on Android, iOS 6, and pre-iOS 6, since iOS 6 nerfed it? Recently updated documentation on the Apple dev site: http://developer.apple.com/library/ios/#featuredarticles/iPhoneURLScheme_Reference/Articles/MapLinks.html#//apple_ref/doc/uid/TP40007894-SW1 Both of these links go to the same place: http://maps.google.com/maps?q=New+York and http://maps.apple.com/maps?q=New+York
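
    A hedged sketch of the user-agent swap described above (a sketch only - UA sniffing is brittle, and the iOS-version regex and the 'map-button' element id are assumptions):

        function mapsUrl(query) {
            var q = encodeURIComponent(query);
            var ua = navigator.userAgent;
            var isIos = /iPhone|iPad|iPod/.test(ua);
            var iosVersion = isIos ? parseInt((ua.match(/OS (\d+)_/) || [0, 0])[1], 10) : 0;
            // iOS 6+ hands maps.apple.com links to the native Maps app;
            // everything else (Android, older iOS, desktop) keeps the Google Maps URL.
            return (iosVersion >= 6)
                ? 'http://maps.apple.com/?q=' + q
                : 'http://maps.google.com/maps?q=' + q;
        }

        document.getElementById('map-button').href = mapsUrl('New York');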

    Read the article

  • How to delete a post in Nhibernate?

    - by marko
    I'm trying to delete a post in NHibernate but nothing happens. Updating, selecting and inserting items works fine, but when I try to delete, nothing happens.

        IQuery query = session.CreateQuery("from Color where name like '%" + TextBox2.Text.Trim() + "%'");
        Color color = query.List<Color>()[0];
        session.Delete(color);

    Edit: I forgot to call the Flush method; now it works fine, like this:

        session.Flush();
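
    A hedged variant of the same delete wrapped in a transaction (which flushes on commit) and with the LIKE value passed as a parameter rather than concatenated into the HQL; the property name Name is assumed from the mapping:

        // using System.Linq;
        using (var tx = session.BeginTransaction())
        {
            var color = session.CreateQuery("from Color c where c.Name like :name")
                               .SetString("name", "%" + TextBox2.Text.Trim() + "%")
                               .List<Color>()
                               .FirstOrDefault();
            if (color != null)
            {
                session.Delete(color);
            }
            tx.Commit();   // commits the transaction and flushes the pending delete
        }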

    Read the article

  • Activity won't start a service

    - by Marko Cakic
    I'm trying to start an IntentService from the main activity of my application and it won't start. I have the service in the manifest file. Here's the code:

    Main activity:

        public class Home extends Activity {
            private LinearLayout kontejner;
            IntentFilter intentFilter;

            @Override
            public void onCreate(Bundle savedInstanceState) {
                super.onCreate(savedInstanceState);
                setContentView(R.layout.activity_home);
                kontejner = (LinearLayout) findViewById(R.id.kontejner);
                intentFilter = new IntentFilter();
                startService(new Intent(getBaseContext(), HomeService.class));
            }
        }

    Service:

        public class HomeService extends IntentService {
            public HomeService() {
                super("HomeService");
                // TODO Auto-generated constructor stub
            }

            @Override
            protected void onHandleIntent(Intent intent) {
                Toast.makeText(getBaseContext(), "TEST", Toast.LENGTH_LONG).show();
            }
        }

    Manifest:

        <manifest xmlns:android="http://schemas.android.com/apk/res/android"
            package="com.example.salefinder"
            android:versionCode="1"
            android:versionName="1.0" >

            <uses-sdk android:minSdkVersion="8" android:targetSdkVersion="15" />

            <application
                android:icon="@drawable/ic_launcher"
                android:label="@string/app_name"
                android:theme="@style/AppTheme" >
                <activity
                    android:name=".Home"
                    android:label="@string/title_activity_home" >
                    <intent-filter>
                        <action android:name="android.intent.action.MAIN" />
                        <category android:name="android.intent.category.LAUNCHER" />
                    </intent-filter>
                </activity>
                <service android:name=".HomeService" />
            </application>

            <uses-permission android:name="android.permission.INTERNET"/>
        </manifest>

    How can I make it work?
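
    One common explanation is that the service does start but the evidence used to check it never shows up: onHandleIntent() runs on a background worker thread, and a Toast raised there may not appear. A hedged sketch that logs and posts the Toast to the main thread instead:

        @Override
        protected void onHandleIntent(Intent intent) {
            Log.d("HomeService", "onHandleIntent called");   // shows up in LogCat regardless of threads

            // Toast must be shown from the main (UI) thread; onHandleIntent is not on it.
            new Handler(Looper.getMainLooper()).post(new Runnable() {
                @Override
                public void run() {
                    Toast.makeText(getApplicationContext(), "TEST", Toast.LENGTH_LONG).show();
                }
            });
        }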

    Read the article

  • Parameter pack aware std::is_base_of()

    - by T. Carter
    Is there a way to have a static assertion on whether a type provided as a template argument implements all of the types listed in a parameter pack, i.e. a parameter-pack-aware std::is_base_of()?

        template <typename Type, typename... Requirements>
        class CommonBase
        {
            static_assert(is_base_of<Requirements..., Type>::value, "Invalid.");
            //            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
            //            parameter-pack-aware version of std::is_base_of()

        public:
            template <typename T>
            T* as()
            {
                static_assert(std::is_base_of<Requirements..., T>::value, "Invalid.");
                return reinterpret_cast<T*>(this);
            }
        };
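
    A sketch of such a trait - a C++17 fold expression, plus a C++11-compatible recursive form for older compilers (the trait names here are made up):

        #include <type_traits>

        // C++17: fold the pack with &&
        template <typename T, typename... Requirements>
        struct is_base_of_all
            : std::integral_constant<bool, (std::is_base_of<Requirements, T>::value && ...)> {};

        // C++11: recursive equivalent
        template <typename T, typename... Rs>
        struct is_base_of_all_11 : std::true_type {};

        template <typename T, typename R, typename... Rs>
        struct is_base_of_all_11<T, R, Rs...>
            : std::integral_constant<bool,
                  std::is_base_of<R, T>::value && is_base_of_all_11<T, Rs...>::value> {};

        // Usage inside CommonBase:
        //   static_assert(is_base_of_all<Type, Requirements...>::value, "Invalid.");

    Note the argument order: std::is_base_of<Base, Derived>, so each requirement goes first and the checked type second.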

    Read the article

  • how can I speed up insertion of many rows to a table via ADO.NET?

    - by jcollum
    I have a table that has 5 columns: AcctId (int), Address1 (varchar), Address2 (varchar), Person1 (varchar), Person2 (varchar). I'm generating random data to insert into this table via a C# console application. I've tried doing this random data insert via SQL Server and decided it was not a good solution - SQL is not good at random on a per-row basis. Generating the random data - 975k rows of it - takes a minimal amount of time. It's in a List of custom objects.

    I need to take this random data and update many rows in the database with the new random data. I tried updating the rows one at a time, which was very slow because of the repeated searching of the List object in code. So I think the best approach is to put all the randomized data into a table in the database, then update all the other tables that use this data. I.e. UPDATE t SET t.Address1 = d.Address1 FROM Table1 t INNER JOIN RandomizedData d ON d.AcctId = t.Acct_ID. The database is very un-normalized, so this Acct data is sprinkled all over the place, and I've got no control over the normalization.

    So, having decided to insert all of the randomized data into a single table, I set out to create insert scripts:

        USE TheDatabase
        INSERT tmp_RandomizedData
        SELECT 1, '4392 EIGHTH AVE', '', 'JENNIFER CARTER', 'BARBARA CARTER' UNION ALL
        SELECT 2, '2168 MAIN ST', 'HNGR F', 'DANIEL HERNANDEZ', 'SUSAN MARTIN'
        -- etc, another 98 times... (FYI, this is not real data!)

    I'm building this INSERT script in batches of 100. It's taking on average 175 ms to run each insert. Does this seem like a long time? It's going to take about 35 minutes to run the whole insert. The table doesn't have a primary key or any indexes; I was planning on adding those after all the data is inserted (thinking that would be faster). Is there a better way to do this?
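
    Since the rows already live in a C# List, SqlBulkCopy can push them into the staging table directly and typically turns minutes of batched INSERTs into seconds. A minimal sketch (RandomRow, rows and connectionString are stand-ins for the console app's own names):

        // using System.Data; using System.Data.SqlClient;
        var table = new DataTable();
        table.Columns.Add("AcctId", typeof(int));
        table.Columns.Add("Address1", typeof(string));
        table.Columns.Add("Address2", typeof(string));
        table.Columns.Add("Person1", typeof(string));
        table.Columns.Add("Person2", typeof(string));

        foreach (RandomRow r in rows)   // rows is the existing List of custom objects
        {
            table.Rows.Add(r.AcctId, r.Address1, r.Address2, r.Person1, r.Person2);
        }

        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "dbo.tmp_RandomizedData";
            bulk.BatchSize = 10000;
            bulk.WriteToServer(table);
        }

    Loading into a heap with no indexes (as described) is exactly the layout bulk copy likes, and the UPDATE ... INNER JOIN step afterwards stays the same.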

    Read the article

  • how to join tables sql server

    - by Rick
    I'm having some trouble with joining two tables. This is what my two tables look like:

    Table 1:
        Customer_ID   CustomerName   Add.
        1000          John Smith
        1001          Mike Coles
        1002          Sam Carter

    Table 2:
        Sensor_ID   Location   Temp   CustIDFK
        1000        NY         70
        1002        NY         70
        1000        ...        ...
        1001
        1001
        1002

    Desired:
        Sensor_ID   Location   Temp   CustIDFK
        1000        NY         70     John Smith
        1002        NY         70     Sam Carter
        1000        ...        ...    John Smith
        1001                          Mike Coles
        1001
        1002

    I have made Customer_ID from Table 1 my primary key, created CustIDFK in Table 2 and set that as my foreign key. I am really new to SQL Server, so I am still having trouble with the whole relationship piece of it. My goal is to match one Customer_ID with one Sensor_ID. The problem is that Table 2 does not have "unique IDs", since they repeat, so I can't set that as my foreign key. I know I will have to do either an inner join or an outer join; I just don't know how to link the sensor ID with the customer one. I was thinking of giving my Sensor_ID a unique ID, but the data that is being inserted into Table 2 is coming from another program. Any suggestions?
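
    A hedged sketch of the join, assuming the Sensor_ID values in Table 2 correspond to the Customer_ID values in Table 1 (which is what the sample data suggests); a LEFT JOIN keeps sensor rows that have no matching customer:

        SELECT s.Sensor_ID,
               s.Location,
               s.Temp,
               c.CustomerName
        FROM   Table2 AS s
        LEFT JOIN Table1 AS c
               ON c.Customer_ID = s.Sensor_ID;

    If CustIDFK is what actually holds the matching customer, the join condition becomes ON c.Customer_ID = s.CustIDFK instead.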

    Read the article

  • Nokia promises a Windows 8 tablet and a "revolutionary" smartphone that will no longer need to be touched or looked at

    Nokia promises a Windows 8 tablet and a "revolutionary" smartphone "that users will no longer need to look at or touch". The trade magazine DigiTimes reported in an article published on March 12 that Nokia is currently outsourcing production of a 10-inch tablet to a Chinese company, Compal Electronics. According to the same article, 200,000 units would initially be built, with commercial availability planned for the fourth quarter of this year. This is news that Marko Ahtisaari, senior vice-president of Nokia's Design division, confirmed in an interview with the Finnish magazine Kauppalehti O...

    Read the article

  • How does IE8 handle the XML header?

    - by markovuksanovic
    I was wondering where I can find some information about how IE8 actually handles the XML declaration - for example, how is <?xml version="1.0" encoding="utf-8"?> handled differently from <?xml version="1.0"?>? Another question is how Firefox handles these declarations, and how that differs from IE8. I am almost 100% sure that they handle them differently, but am still doing some research. /Marko

    Read the article
