Search Results

Search found 6392 results on 256 pages for 'reduce duplicate'.


  • How to speed up marching cubes?

    - by Dan Vinton
    I'm using this marching cube algorithm to draw 3D isosurfaces (ported into C#, outputting MeshGeometry3Ds, but otherwise the same). The resulting surfaces look great, but are taking a long time to calculate. Are there any ways to speed up marching cubes? The most obvious one is to simply reduce the spatial sampling rate, but this reduces the quality of the resulting mesh. I'd like to avoid this. I'm considering a two-pass system, where the first pass samples space much more coarsely, eliminating volumes where the field strength is well below my isolevel. Is this wise? What are the pitfalls? Edit: the code has been profiled, and the bulk of CPU time is split between the marching cubes routine itself and the field strength calculation for each grid cell corner. The field calculations are beyond my control, so speeding up the cubes routine is my only option... I'm still drawn to the idea of trying to eliminate dead space, since this would reduce the number of calls to both systems considerably.
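
    One way to structure that coarse first pass: chop the fine grid into blocks, sample the field only at block corners, and skip any block whose sampled values all sit clearly on one side of the isolevel. The sketch below is illustrative Python rather than the poster's C#; field, isolevel, dims and the block size are assumed names, and the margin guards against the main pitfall, namely a surface feature that dips through a block between its sampled corners.

        # Hypothetical sketch of a coarse "dead space" cull before marching cubes.
        def active_blocks(field, isolevel, dims, block=8, margin=0.0):
            """Return coarse blocks the isosurface might pass through; only these
            blocks get fine sampling and the full marching cubes pass."""
            nx, ny, nz = dims
            keep = []
            for bx in range(0, nx, block):
                for by in range(0, ny, block):
                    for bz in range(0, nz, block):
                        corners = [field(x, y, z)
                                   for x in (bx, min(bx + block, nx - 1))
                                   for y in (by, min(by + block, ny - 1))
                                   for z in (bz, min(bz + block, nz - 1))]
                        # Keep the block unless the field is clearly below (or above)
                        # the isolevel at every sampled corner, with a safety margin.
                        if min(corners) - margin <= isolevel <= max(corners) + margin:
                            keep.append((bx, by, bz))
            return keep

    The trade-off is the margin: too small and thin features get clipped out of the mesh, too large and few blocks are culled; sampling block centres as well as corners is a cheap way to reduce the risk.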


  • priority queue with limited space: looking for a good algorithm

    - by SigTerm
    This is not homework. I'm using a small "priority queue" (implemented as an array at the moment) for storing the last N items with the smallest values. This is a bit slow - O(N) item insertion time. The current implementation keeps track of the largest item in the array and discards any items that wouldn't fit, but I would still like to reduce the number of operations further. I'm looking for a priority queue algorithm that matches the following requirements:
    - The queue can be implemented as an array, which has fixed size and _cannot_ grow. Dynamic memory allocation during any queue operation is strictly forbidden.
    - Anything that doesn't fit into the array is discarded, but the queue keeps all the smallest elements ever encountered.
    - O(log(N)) insertion time (i.e. adding an element to the queue should take at most O(log(N))).
    - (optional) O(1) access to the *largest* item in the queue (the queue stores the *smallest* items, so the largest item will be discarded first, and I'll need it to reduce the number of operations).
    - Easy to implement/understand. Ideally something similar to binary search - once you understand it, you remember it forever.
    Elements need not be sorted in any way. I just need to keep the N smallest values ever encountered. When I need them, I'll access all of them at once, so technically it doesn't have to be a queue; I just need the N smallest values to be stored. I initially thought about using binary heaps (they can be easily implemented via arrays), but apparently they don't behave well when the array can't grow any more. Linked lists and arrays will require extra time for moving things around. The STL priority queue grows and uses dynamic allocation (I may be wrong about that, though). So, any other ideas?
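
    A minimal sketch of the usual fixed-capacity answer: keep a max-heap of at most N entries over a plain array, compare each incoming value against the root, and replace the root only when the new value is smaller. The sketch is Python (heapq is a min-heap, so values are negated to simulate a max-heap); in C or C++ the same thing is a fixed array plus sift-down, with no allocation after the array is created.

        import heapq

        class SmallestN:
            """Keep the N smallest values seen so far in a fixed-capacity heap."""

            def __init__(self, n):
                self.n = n
                self._heap = []                             # never holds more than n items

            def add(self, value):
                if len(self._heap) < self.n:
                    heapq.heappush(self._heap, -value)      # O(log n), still filling up
                elif value < -self._heap[0]:                # smaller than largest kept?
                    heapq.heapreplace(self._heap, -value)   # drop largest, insert, O(log n)
                # otherwise discard: value is not among the n smallest

            def largest_kept(self):
                return -self._heap[0]                       # O(1) peek at the next item to go

            def values(self):
                return sorted(-v for v in self._heap)       # only needed at the very end

    The "heaps don't behave well when the array can't grow" worry disappears because nothing is pushed once the heap holds N items; replacement keeps the size constant.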


  • ASP.NET NamingContainer naming convention

    - by EOLeary
    The Background
    Hello! I'm working on a project in which the client has required a lot of things to happen on a single page, and this has resulted in a rather large blob of HTML being rendered out to the client browser. The main issue is with input tags (where the runat="server" attribute is set); these tend to cause a drastic increase in markup size due to validation, UpdatePanel triggers, viewstate, and the control markup itself. I've done what I can to reduce the number of triggers I'm using, I'm compressing the viewstate (to something like 8% of the original viewstate size), I've gotten rid of a lot of ASP.NET validators and rolled my own, and I've been using ClientIDMode to reduce the length of the ID attributes of many ASP.NET elements. All of these combined significantly reduce the amount of HTML being sent to the client (for example, going from 2 megabytes for a request down to 500-600 kb - these are HUGE pages, mind you).
    The Issue
    One area which I've been having trouble reducing is simply the auto-generated 'name' attribute of input elements.
        <input name="ctl00$ctl00$ctl00$_main$_main$_bodyMatterPhase$_phaseTree$ctl00$_taskTree$ctl00$_taskDetails$_detailList$ctrl0$_row$_descriptionText" type="text" value="Investigation Week 1" maxlength="100" id="_taskTree_0__taskDetails_0__detailList_0__row_0__descriptionText_0" style="width:170px;">
    As you can see above, the name attribute is 139 out of 297 characters - that's almost 50% of the tag markup taken up by that HUGE name. Does anyone have any ideas on how to stick a hook in somewhere in ASP.NET where I can somehow translate these or generate them differently; say instead of ctl00$ctl00$ctl00$_main$_main$_bodyMatterPhase$_phaseTree$ctl00$_taskTree$ctl00$_taskDetails$_detailList$ctrl0$_row$_descriptionText, it could be a GUID like 0x0AEED4B6445A11E08F873606E0D72085, which is 105 characters shorter. Any help would be greatly appreciated!


  • Should I remove all inheritance from my model in order to work with ria services?

    - by TimothyP
    I've posted some questions on this before, but this one is different. Consider a small portion of our model: Person is the base class, and Customer, Employee and Spouse inherit from it. These 4 classes are very central in our design and link to many other entities. I could solve all the problems I'm experiencing with RIA Services by removing the inheritance, but that would really increase the complexity of the model. The first problem I experienced was that I couldn't query for Customers, Employees or Spouses, but someone gave me a solution, which was to add something like this to the DomainService:
        public IQueryable<Employee> GetEmployees() { return this.ObjectContext.People.OfType<Employee>(); }
        public IQueryable<Customer> GetCustomers() { return this.ObjectContext.People.OfType<Customer>(); }
        public IQueryable<Spouse> GetSpouses() { return this.ObjectContext.People.OfType<Spouse>(); }
    Next I tried something that seemed very normal to me:
        var employee = new Employee() {
            //.... left out to reduce the length of this question
        };
        var spouse = new Spouse() {
            //.... left out to reduce the length of this question
        };
        employee.Spouse = spouse;
        context.People.Add(spouse);
        context.People.Add(employee);
        context.SubmitChanges();
    Then I get the following exception: Cannot retrieve an entity set for the derived entity type 'Spouse'. Use EntityContainer.GetEntitySet(Type) to get the entity set for the base entity type 'Person'. Even when the spouse is already in the database and I retrieve it first, I get similar exceptions. Also note that for some reason "Persons" is used instead of "People" in some places... So how do I solve this problem, what am I doing wrong, and will I keep running into walls when using RIA Services with inheritance? I found some references on the web, all saying it works, and then some DomainService code in which they supposedly changed something, but no details... I'm using VS2010 RC1 + Silverlight 4.


  • Generate and merge data with python multiprocessing

    - by Bobby
    I have a list of starting data. I want to apply a function to the starting data that creates a few pieces of new data for each element in the starting data. Some pieces of the new data are the same and I want to remove them. The sequential version is essentially:
        def create_new_data_for(datum):
            """make a list of new data from some old datum"""
            return [datum.modified_copy(k) for k in datum.k_list]

        data = [some list of data]  # some data to start with

        # generate a list of new data from the old data, we'll reduce it next
        newdata = []
        for d in data:
            newdata.extend(create_new_data_for(d))

        # now reduce the data under ".matches(other)"
        reduced = []
        for d in newdata:
            for seen in reduced:
                if d.matches(seen):
                    break
            else:
                # so we haven't seen anything like d yet
                reduced.append(d)

        # now reduced is finished and is what we want!
    I want to speed this up with multiprocessing. I was thinking that I could use a multiprocessing.Queue for the generation. Each process would just put the stuff it creates on the queue, and when the processes are reducing the data, they could just get the data from the queue. But I'm not sure how to have the different processes loop over reduced and modify it without any race conditions or other issues. What is the best way to do this safely? Or is there a different way to accomplish this goal better?
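
    One safe split, assuming create_new_data_for is the expensive part and .matches() is comparatively cheap: parallelize only the generation with a Pool, and keep the reduction in the parent process so no worker ever touches reduced and there is nothing to lock. A minimal sketch (modified_copy, k_list and matches are the poster's own methods, and the datum objects must be picklable to cross process boundaries):

        import itertools
        from multiprocessing import Pool

        def create_new_data_for(datum):
            """Make a list of new data from some old datum (poster's function)."""
            return [datum.modified_copy(k) for k in datum.k_list]

        def generate_and_reduce(data, processes=4):
            # 1) Generation in parallel: one input datum per task.
            with Pool(processes) as pool:
                chunks = pool.map(create_new_data_for, data)
            newdata = itertools.chain.from_iterable(chunks)

            # 2) Reduction stays single-threaded in the parent, so 'reduced'
            #    is never shared between processes and there are no races.
            reduced = []
            for d in newdata:
                if not any(d.matches(seen) for seen in reduced):
                    reduced.append(d)
            return reduced

    If matches() can be replaced by a hashable key (two items match exactly when their keys are equal), the reduction collapses to a set or dict lookup and can even be done per-chunk inside the workers before a final merge.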


  • Need help implementing this algorithm with map Hadoop MapReduce

    - by Julia
    Hi all! I have an algorithm that goes through a large data set, reads some text files and searches for specific terms in those lines. I have it implemented in Java, but I didn't want to post code so that it doesn't look like I am searching for someone to implement it for me, but it is true that I really need a lot of help!!! This was not planned for my project, but the data set turned out to be huge, so the teacher told me I have to do it like this. EDIT (I did not clarify this in the previous version): the data set I have is on a Hadoop cluster, and I should make a MapReduce implementation of it. I was reading about MapReduce and thought that I would first do the standard implementation and then it would be more or less easier to do it with MapReduce. But that didn't happen, since the algorithm is quite stupid and nothing special, and map reduce... I can't wrap my mind around it. So here, shortly, is the pseudo code of my algorithm:
        LIST termList   (there is a method that creates this list from a Lucene index)
        FOLDER topFolder

        INPUT topFolder
        IF it is folder and not empty
            list files (there are 30 sub folders inside)
            FOR EACH sub folder
                GET file "CheckedFile.txt"
                analyze(CheckedFile)
            ENDFOR
        END IF

        Method ANALYZE(CheckedFile)
            read CheckedFile
            WHILE CheckedFile has next line
                GET line
                FOR (loops through termList)
                    GET third word from line
                    IF third word = term from list
                        append whole line to string buffer
                    ENDIF
                ENDFOR
            END WHILE
            OUTPUT string buffer to file
    Also, as you can see, each time "analyze" is called a new output file has to be created, and I understood that MapReduce has difficulty writing to many outputs??? I understand the MapReduce intuition, and my example seems perfectly suited for MapReduce, but when it comes to doing it, obviously I do not know enough and I am STUCK! Please please help.
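
    To make the map/reduce split concrete, here is a hedged Hadoop Streaming-style sketch in Python rather than Java: the mapper gets lines of the CheckedFile.txt files on stdin, keeps only lines whose third word is in the term list (shipped to each node as a side file, name assumed here), and emits term-keyed records; the reducer receives the records sorted by key and writes one block per term, which is the usual way MapReduce replaces "one output file per analyze() call". Some Streaming setups also expose the current input file name to the mapper through an environment variable, which would allow keying by source folder instead.

        #!/usr/bin/env python
        # mapper.py (Hadoop Streaming) - assumes terms.txt is distributed to each node
        import sys

        with open("terms.txt") as f:
            terms = set(line.strip() for line in f if line.strip())

        for line in sys.stdin:
            words = line.split()
            if len(words) >= 3 and words[2] in terms:
                # key = matched term, value = the whole matching line
                print("%s\t%s" % (words[2], line.rstrip("\n")))

        #!/usr/bin/env python
        # reducer.py - records arrive grouped and sorted by key, so one pass suffices
        import sys

        current = None
        for record in sys.stdin:
            term, _, matched = record.rstrip("\n").partition("\t")
            if term != current:
                current = term
                print("== matches for term: %s ==" % term)
            print(matched)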


  • Splitting a set of object into several subsets of 'similar' objects

    - by doublep
    Suppose I have a set of objects, S. There is an algorithm f that, given a set S, builds a certain data structure D on it: f(S) = D. If S is large and/or contains vastly different objects, D becomes large, to the point of being unusable (i.e. not fitting in allotted memory). To overcome this, I split S into several non-intersecting subsets: S = S1 + S2 + ... + Sn and build Di for each subset. Using n structures is less efficient than using one, but at least this way I can fit into memory constraints. Since the size of f(S) grows faster than S itself, the combined size of the Di is much less than the size of D. However, it is still desirable to reduce n, i.e. the number of subsets, or to reduce the combined size of the Di. For this, I need to split S in such a way that each Si contains "similar" objects, because then f will produce a smaller output structure if the input objects are "similar enough" to each other. The problem is that while "similarity" of objects in S and the size of f(S) do correlate, there is no way to compute the latter other than just evaluating f(S), and f is not quite fast. The algorithm I have currently is to iteratively add each next object from S into one of the Si, so that this results in the least possible (at this stage) increase in combined Di size:
        for x in S:
            i = such i that size(f(Si + {x})) - size(f(Si)) is min
            Si = Si + {x}
    This gives practically useful results, but is certainly pretty far from optimal (i.e. the minimal possible combined size). Also, it is slow. To speed things up somewhat, I compute size(f(Si + {x})) - size(f(Si)) only for those i where x is "similar enough" to objects already in Si. Is there any standard approach to such kinds of problems? I know of the branch and bound algorithm family, but it cannot be applied here because it would be prohibitively slow. My guess is that it is simply not possible to compute the optimal distribution of S into the Si in reasonable time. But is there some common iteratively improving algorithm?
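
    The greedy loop above becomes clearer (and a little faster) if the current size of each f(Si) is cached instead of recomputed, and the "similar enough" check is applied as a pre-filter. A sketch under the assumption that build_cost(subset) returns size(f(subset)) and similar_enough(x, subset) is the poster's cheap similarity test:

        def greedy_split(S, build_cost, similar_enough):
            """Assign each object to the subset whose structure grows the least."""
            subsets, costs = [], []                 # costs[i] caches size(f(subsets[i]))
            for x in S:
                best_i, best_delta = None, None
                for i, si in enumerate(subsets):
                    if not similar_enough(x, si):
                        continue                    # skip obviously poor candidates
                    delta = build_cost(si + [x]) - costs[i]
                    if best_delta is None or delta < best_delta:
                        best_i, best_delta = i, delta
                if best_i is None:                  # nothing similar yet: open a new subset
                    subsets.append([x])
                    costs.append(build_cost([x]))
                else:
                    subsets[best_i].append(x)
                    costs[best_i] += best_delta
            return subsets

    A common iteratively improving follow-up, in the spirit of k-means style local search, is to run reassignment passes afterwards: pick an object, tentatively move it to the subset where it increases combined size the least, keep the move only if the combined size drops, and repeat until no move helps.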


  • [iPhone] Play video not in full screen mode

    - by Kyo
    Hello, First, excuse my English :) I've read on the Apple developer website that the video playback provided by the framework supports only full screen mode. I need to develop an application where video can be played in a reduced (non-full-screen) mode. I've seen that Orange TV does something which looks like what I need to do. http://img218.imageshack.us/img218/1228/tvplayerorange.jpg The application is available on the App Store, but you need a subscription to test it. Anyway, to summarize it: you can watch video (a TV stream) in a reduced mode, and if you tap the screen it switches to full screen mode. So my question: what I want to do is possible (Orange TV made it), but I wonder how difficult it is to make. It seems that I would have to write my own video player. If that takes a bunch of time, I think I will use the Media Player framework of the iPhone, even if it isn't the optimal solution for me. Feel free to ask me for more details ;) Thank you for your answers.


  • CSS Sprite for images which have vertical as well as horizontal repeats

    - by Rachel
    I have four images, one of which has the background-repeat property in the horizontal direction and three of which repeat in the vertical direction. I have different CSS classes which currently use these images as follows:
        .sb_header_dropdown {
            background: url(images/shopping_dropdown_bg.gif) repeat-y top left;
            padding: 8px 3px 8px 15px;
        }
        .shopping_basket_dropdown .sb_body {
            background: url(images/shopping_dropdown_body_bg.png) repeat-y top left;
            margin: 0;
            padding: 5px 9px 5px 8px;
            position: relative;
            z-index: 99999;
        }
        .checkout_cart .co_header_left {
            background: url(images/bg.gif) repeat-x 0 -150px;
            overflow: hidden;
            padding-left: 3px;
        }
        .sb_dropdown_footer {
            background: url(images/shopping_dropdown_footer_bg.png) repeat-y top left;
            clear: both;
            height: 7px;
            font-size: 0;
        }
    So here I am making 4 HTTP requests, and I want to implement a CSS sprite for all 4 images so that I can reduce the number of HTTP requests from 4 to 1. The thing to keep in mind is that all 4 images have a background repeat, either in the x-direction or the y-direction, so how should the sprite be created and how can it be used in the CSS to reduce the number of HTTP requests? I hope this question is clear.


  • Hashing 11 byte unique ID to 32 bits or less

    - by MoJo
    I am looking for a way to reduce an 11-byte unique ID to 32 bits or fewer. I am using an Atmel AVR microcontroller that has the ID number burned in at the factory, but because it has to be transmitted very often in a very low power system I want to reduce the length down to 4 bytes or fewer. The ID is guaranteed unique for every microcontroller. It is made up of data from the manufacturing process, basically the coordinates of the silicon on the wafer and the production line that was used. They look like this: 304A34393334-16-11001000 314832383431-0F-09000C00 Obviously the main danger is that by reducing these IDs they become non-unique. Unfortunately I don't have a large enough sample size to test how unique these numbers are. Having said that, because there will only be tens of thousands of devices in use, and there is secondary information that can be used to help identify them (such as their approximate location, known at the time of communication), collisions might not be too much of an issue if they are few and far between. Is something like MD5 suitable for this? My concern is that the data being hashed is very short, just 11 bytes. Do hash functions work reliably on such short data?
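
    A minimal sketch of the usual approach: run the 11 ID bytes through any well-mixed hash and keep the first 4 bytes. Hash functions have no problem with short inputs; the real issue is the birthday bound on a 32-bit output, which the secondary information (location, etc.) then has to absorb. SHA-256 is used below simply because it is ubiquitous; on an AVR a lighter function may be more practical, at the cost of weaker mixing.

        import hashlib

        def short_id(unique_id_bytes):
            """Map the 11-byte factory ID to a 32-bit value by truncating a hash."""
            digest = hashlib.sha256(unique_id_bytes).digest()
            return int.from_bytes(digest[:4], "big")

        # Rough birthday-bound estimate for n devices and a 32-bit identifier:
        #     p(some collision) ~= 1 - exp(-n*(n-1) / (2 * 2**32))
        # e.g. n = 50,000 gives p of roughly 0.25, so with tens of thousands of
        # devices a handful of collisions should be planned for, not ruled out.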


  • openvpn TCP/UDP slow SSH/SMB performance

    - by Petr Latal
    I have a question about strange behaviour of my OpenVPN configuration on Debian Lenny. I have 2 server configs (one proto tcp-server based and one proto udp based). The ISP bandwidth is 7Mbit/7Mbit. When I use proto tcp-server, my server download rate is fine, around 6.4 Mbit/s, but the upload rate is about 3 Mbit/s. When I use proto udp, my server download rate is around 3 Mbit/s and the upload rate around 6.4 Mbit/s. I tried tuning the MTU, mssfix and cipher on/off in the server and client configs to bring the rates in line, but without success. Here is the TCP based SERVER config:
        mode server
        tls-server
        port 1194
        proto tcp-server
        dev tap0
        ifconfig 11.10.15.1 255.255.255.0
        ifconfig-pool 11.10.15.2 11.10.15.20 255.255.255.0
        push "route 192.168.1.0 255.255.255.0"
        push "dhcp-option DNS 192.168.1.200"
        push "route-gateway 11.10.15.1"
        push "dhcp-option WINS 192.168.1.200"
        route-up /etc/openvpn/routeup.sh
        duplicate-cn
        ca /etc/openvpn/ca.crt
        cert /etc/openvpn/server.crt
        key /etc/openvpn/server.key
        dh /etc/openvpn/dh2048.pem
        log-append /var/log/openvpn.log
        status /var/run/vpn.status 10
        user nobody
        group nogroup
        keepalive 10 120
        comp-lzo
        verb 3
        script-security 3
        plugin /usr/lib/openvpn/openvpn-auth-pam.so system-auth
        persist-tun
        persist-key
        mssfix
        cipher BF-CBC
    Here is the UDP based SERVER config:
        port 1194
        proto udp
        dev tun0
        local xx.xx.xx.xx
        server 11.10.15.0 255.255.255.0
        ca /etc/openvpn/ca.crt
        cert /etc/openvpn/server.crt
        key /etc/openvpn/server.key
        dh /etc/openvpn/dh2048.pem
        log-append /var/log/openvpn.log
        status /var/run/vpn.status 10
        user nobody
        group nogroup
        keepalive 10 120
        comp-lzo
        verb 3
        duplicate-cn
        script-security 3
        plugin /usr/lib/openvpn/openvpn-auth-pam.so system-auth
        persist-tun
        persist-key
        tun-mtu 1500
        mssfix 1212
        client-to-client
        ifconfig-pool-persist ipp.txt
    Here is the TCP/UDP based Windows CLIENT config:
        remote xx.xx.xx.xx --socket-flags TCP_NODELAY
        tls-client
        port 1194
        proto tcp-client
        #proto udp
        dev tap
        #dev tun
        pull
        ca ca.crt
        cert latis.crt
        key latis.key
        mute 0
        comp-lzo adaptive
        verb 3
        resolv-retry infinite
        nobind
        persist-key
        auth-user-pass
        auth-nocache
        script-security 2
        mssfix
        cipher BF-CBC


  • Can't boot flash drive on GIGABYTE motherboard

    - by Deltik
    Situation
    When I try to boot from my flash drive, my GIGABYTE 970A-UD3 motherboard returns this:
        Loading Operating System ...
        Boot error
    All other motherboards I've tried support booting from that flash drive (and a backup flash drive). The operating systems I tried on both flash drives were created with usb-creator-gtk (Ubuntu USB Startup Disk Creator). I know that the motherboard understands that there is an operating system on the flash drives because when I erase them, it complains in an ALL CAPS RAGE that there isn't an operating system, which is correct. How can I boot a flash drive that's bootable from other motherboards on this motherboard?
    Qualification
    This question is not a duplicate of this one because directly writing to the flash drive as an ISO 9660 (dd if=operating_system.iso of=/dev/sdb) still does not have the motherboard recognize the operating system. This question should be a duplicate of this one because I provide more information not provided by that poster. This forum thread has broken links and does not have a solution to my problem. Nobody knows what's going on in this forum thread.


  • Migrating mysql 4 to mysql 5

    - by Lennart Regebro
    This seems to me to be a common use case, so I'm surprised there is so little information about it; sorry if it's a duplicate, but I have searched. :) I'm migrating a client's website from one CMS to another, and moving to newer, faster machines at the same time. As a part of this I'm moving a MySQL database from the old server to the new one. The problem is that the old server runs MySQL 4 and the new one MySQL 5. So when I do a mysqldump on the old site and then try to run it on the new site I get syntax errors.
        ERROR 1064 (42000) at line 178: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'BTREE (`id`), KEY `f_ChangedOnWeb` (`f_ChangedOnWeb`), KEY `f_AddressUpdate`' at line 56
    I also tried to use an even older syntax by dumping with --compatible mysql323, but that just resulted in
        ERROR 1062 (23000) at line 2283: Duplicate entry '??????????' for key 2
    It seems to me this must be a reasonably common use case, yet I can't find any sort of help on this. Possibly all my Google searches just drown in irrelevant answers. Most seem to agree that mysqldump is the right answer, but no one mentions that you can get syntax errors...


  • Can OpenVPN invoke DHCP Client?

    - by Ency
    I have got a working VPN connection through OpenVPN, but I would like to use my own DHCP server rather than OpenVPN's push feature. Currently everything works fine, but I have to start the DHCP client manually, e.g. dhclient tap0, and then I get an IP and the other important stuff from my DHCP server. Is there any directive which starts the DHCP client when the connection is established? Here is my client's config:
        remote there.is.server.com
        float
        dev tap
        tls-client
        #pull
        port 1194
        proto tcp-client
        persist-tun
        dev tap0
        #ifconfig 192.168.69.201 255.255.255.0
        #route-up "dhclient tap0"
        #dhcp-renew
        ifconfig 0.0.0.0 255.255.255.0
        ifconfig-noexec
        ifconfig-nowarn
        ca /etc/openvpn/ca.crt
        cert /etc/openvpn/encyNtb_openvpn_client.crt
        key /etc/openvpn/encyNtb_openvpn_client.key
        dh /etc/openvpn/dh-openvpn.dh
        ping 10
        ping-restart 120
        comp-lzo
        verb 5
        log-append /var/log/openvpn.log
    Here comes the server's config:
        mode server
        tls-server
        dev tap0
        local servers.ip.here
        port 1194
        proto tcp-server
        server-bridge
        # Allow comunication between clients
        client-to-client
        # Allowing duplicate users per one certificate
        duplicate-cn
        # CA Certificate, VPN Server Certificate, key, DH and Revocation list
        ca /etc/ssl/CA/certs/ca.crt
        cert /etc/ssl/CA/certs/openvpn_server.crt
        key /etc/ssl/CA/private/openvpn_server.key
        dh /etc/ssl/CA/dh/dh-openvpn.dh
        crl-verify /etc/ssl/CA/crl.pem
        # When no response is recieved within 120seconds, client is disconected
        keepalive 10 60
        persist-tun
        persist-key
        user openvpn
        group openvpn
        # Log and Connected clients file
        log-append /var/log/openvpn
        verb 3
        status /var/run/openvpn/vpn.status 10
        # Compression
        comp-lzo
        #Push data to client
        push "route-gateway 192.168.69.1"
        push "redirect-gateway def1"


  • Why would the SQL 2008 "Generate scripts..." utility generate an invalid SQL script?

    - by Deane
    I have a SQL 2008 database that needs to be restored to a SQL 2005 instance. I have gone through the "Generate scripts..." wizard, set it for SQL 2005 compatibility, and generated a 62MB SQL script. When I run it on the SQL 2005 instance, it throws all kinds of errors, and some of them are really strange in that they describe an invalid database: FK constraints are wrong. It's trying to create FKs on columns that don't exist. It's trying to insert records with duplicate key errors. It's trying to create the same objects twice. Any idea how this could happen? This SQL script was generated by SQL Server Management Studio just minutes before I tried to restore it, and was not modified. Why would this generate an invalid SQL file? Doesn't it just describe the SQL 2008 database, which is presumably valid since we're using it? In particular, the duplicate key insertion errors mystify me. If there's a key constraint in the SQL script, then there must be the same thing in the SQL 2008 table. So how could we get rows in there that violate that key constraint?


  • internet-based sync software that will keep running after Windows Live Sync stops doing PC-to-PC-syncs?

    - by Warren P
    According to the Wikipedia page, Microsoft Live Sync will shortly stop offering the PC-to-PC sync service. There are lots of apps to sync two PCs on the same LAN, but I want to sync two PCs that are in different cities, across the internet, traversing two different NATs, and that requires some kind of service running on the internet that both connect into. There are already a few questions about syncing folders and files, but this is not a duplicate because none of them answer this basic question: Microsoft Live Sync works better than RSYNC, or any of the linked sync solutions in any of the "not really duplicates", because it works even when the two PCs are behind NATs and firewalls that forbid direct connectivity; Windows Live Sync has a free always-on internet server that all the client PCs connect into. I'm looking for a FREE (no-fees) Microsoft Live Sync work-alike PC-to-PC sync solution that works between PCs and Macs at least, as well as between PCs, and works behind NAT and firewalls at least as well as Microsoft's solution. (Note that Microsoft's solution makes only outbound socket calls to a Microsoft server, so this solution must necessarily include a server-hub component that is hosted publicly on a free site and which does not require that I set up, manage and pay for my own public internet hosting site.) Hint: none of the answers in the linked duplicate are equivalent (PureSync, FreeFileSync, BestSync 2010, SyncButler, Comodo BackUp, QuickShadow, Gbridge), in that none of them work for the PC-to-Mac situation where firewalls and NATs prevent direct connection, or else they require money to be paid. When Microsoft Live Sync / Live Mesh finally kills direct PC-to-PC mode, the limitation will be that you will have to pay for more than 25 GB of cloud service, and you can then only sync PC #1 to PC #2 if you first sync to the cloud, then down to other clients. I can currently sync 100 GB of data from one computer to another, only temporarily "moving the data" through Microsoft's data servers without using up my SkyDrive storage quota.


  • All commands stopped working in centos 6.5

    - by Michael
    I have made a big mistake while removing some duplicate packages, as yum appeared to be broken:
        1036  rpm -e --nodeps glibc-2.12-1.132.el6_5.2.x86_64
        1037  rpm -e --nodeps nscd-2.12-1.132.el6_5.2.x86_64
        1038  rpm -e --nodeps glibc-common-2.12-1.132.el6_5.2.x86_64
        1040  rpm -e --nodeps glibc-common-2.12-1.132.el6.x86_64 glibc-devel-2.12-1.132.el6.x86_64 glibc-headers-2.12-1.132.el6.x86_64
        1041  rpm -e glibc.x86_64
        1042  rpm -e --nodeps glibc.x86_64
    The issue happened after step 1042. None of the commands work (including yum, rpm, ls, cp etc.) and I get the error
        /lib64/ld-linux-x86-64.so.2: bad ELF interpreter: No such file or directory
    I thought that installing glibc after removing all the current ones would help to resolve the duplicate package error :( Now I realise that it is used as the C library in the GNU system and most systems with the Linux kernel. It defines the "system calls" and other basic facilities such as open, malloc, printf, exit, etc. Are there any possible solutions other than a reinstall? I have lost SSH access. Maybe something can be done using a rescue CD? Thanks


  • Recommended offline on-demand virus scanners

    - by ashh
    I have never run full anti-virus on my Windows XP systems. Instead I use various anti-malware tools to manually perform scans every few weeks. This approach, combined with Windows updates and general care about what web-sites I visit and what files I download has kept me 99% free of problems. The remaining 1% has occurred when I download files that I know may contain malware, but still decide the risk is worth it. When on 2 occasions in 10 years I did get caught doing this, I realised that being able to easily scan them would most likely have avoided getting infected. I don't need, or want, to run a "stay resident" anti-virus. Also, the online scanners such as Kaspersky etc limit uploads to small files, so these are not always useful. In summary I would like to simply be able to download a file and then manually initiate an on demand anti-virus scan, on the downloaded file only. I'm sure some/most Anti-Virus do both, however once again I don't really want to pay for or need the stay resident part. Any recommendations (commercial or free)? UPDATE: This is not an exact duplicate, nor a possible duplicate. I searched for and read other questions on anti-virus here at SuperUser and found none that answered my question. I am specifically asking about anti-virus scanners that run ON-DEMAND locally on the computer, not online scanners.


  • Clustered MSDTC

    - by niel
    Hi, I'm setting up a SQL cluster (SQL 2008) on Windows 2008 R2. I enable network access on the local DTC and then create a DTC resource in my cluster. The problem is that when I start up the resource it does not pull through my settings to enable network access. The log shows this:
        MSDTC started with the following settings: Security Configuration (OFF = 0 and ON = 1): Allow Remote Administrator = 0, Network Clients = 0, Trasaction Manager Communication: Allow Inbound Transactions = 0, Allow Outbound Transactions = 0, Transaction Internet Protocol (TIP) = 0, Enable XA Transactions = 0, Enable SNA LU 6.2 Transactions = 1, MSDTC Communications Security = Mutual Authentication Required, Account = NT AUTHORITY\NetworkService, Firewall Exclusion Detected = 0 Transaction Bridge Installed = 0 Filtering Duplicate Events = 1
    whereas when I restart the local DTC service it says this:
        Security Configuration (OFF = 0 and ON = 1): Allow Remote Administrator = 0, Network Clients = 1, Trasaction Manager Communication: Allow Inbound Transactions = 1, Allow Outbound Transactions = 1, Transaction Internet Protocol (TIP) = 0, Enable XA Transactions = 1, Enable SNA LU 6.2 Transactions = 1, MSDTC Communications Security = No Authentication Required, Account = NT AUTHORITY\NetworkService, Firewall Exclusion Detected = 0 Transaction Bridge Installed = 0 Filtering Duplicate Events = 1
    Settings on both nodes in the cluster are the same. I have reinstalled and restarted too many times to mention. Any ideas?


  • Associate email account with "Personal Folders" Outlook data file?

    - by TheLQ
    In the process of migrating email servers I've run into an interesting problem: In Outlook 2007 you have the default "Personal Folders" item. This contains the email for the account that was originally set up with Outlook. My issue is that I have deleted the account associated with that and created an entirely new account. So now I have "Personal Folders" and "[email protected]". However I can't delete "Personal Folders", nor associate "[email protected]" with that PST file. Deleting it in Outlook (Tools > Account Settings > Data Files) gave the error "The default data file cannot be removed, because it is your default delivery location. After you have selected a different default delivery location, your current file can be removed." Deleting the PST file itself (outlook.pst) made Outlook demand where its default file would be. So I selected my "[email protected]" PST file and restarted Outlook. Now "Personal Folders" is called "[email protected]", but I still have a duplicate account with this name. Which is bad. Worse, my email is associated with the duplicate PST, not the default. How can I associate my email with my default PST or delete the default PST entirely? Luckily I have backups.


  • How can I make a copy of a printer in Win7?

    - by hawbsl
    Has anyone been able to copy an existing printer in Win7? I know that we were able to do this in XP (not as a direct copy/paste, but by installing the printer twice) but in Win7 it doesn't seem to be possible. Googling the answer is hopeless because searching for "copy printer" or "duplicate printer" you get a bunch of posts about "printer copiers" or people complaining about duplicate printers getting created in the background (precisely what I'd like to be able to do) It'd be good to know how to do it in general, but if it depends on the printer type, then in our case we are trying to make a copy of an HP Laserjet. Tried installing from the CD - but the CD is too old for Win7 Tried installing via Add Printer and that seems to install the printer but it's marked with an error. Tried installing via the .exe installer from the HP site and that does result in a successful printer being installed, but it won't let you install the same printer twice (stalls on the "insert USB cable now" step - simply won't enable the greyed out "Next" button). The reason this is required is so that we can print to one to the feeder and to the tray separately.


  • How to find what files / directories are not copied yet?

    - by user8676
    Hi all, I found the following 'nice' situation: An archive of a few disks (actually three disks) which has a bunch of photos, (more or less) organized. Well, this is good. A big disk shared on a network which has a bunch of photos in a different folder structure (even if it is somewhat recognizable for a human being) than the archive described above, but some of the files on this big network share are the same as the files from the archive. Well, this is bad. What we need is to move the different (new) files from the network share into the archive (perhaps we'll use a new disk added to the archive for this). The program we need is different from a regular File Duplicate Finder program, because a File Duplicate Finder usually finds the duplicates from all sources by comparing each file with the others; we want to find the differences between the two sources. It is fine for us to have a report generated in a text file which we'll then use to do our move. A Windows solution would be preferred. Any ideas? TIA
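
    A minimal sketch of the approach (Python, run on any machine that can see both the archive disks and the share): hash every file in the archive, then walk the network share and write out the path of every file whose content hash is not present in the archive. The paths below are placeholders for the three archive disks and the share.

        import hashlib, os

        def file_hash(path, chunk=1024 * 1024):
            h = hashlib.sha1()
            with open(path, "rb") as f:
                for block in iter(lambda: f.read(chunk), b""):
                    h.update(block)
            return h.hexdigest()

        def hashes_under(root):
            for dirpath, _, filenames in os.walk(root):
                for name in filenames:
                    path = os.path.join(dirpath, name)
                    yield file_hash(path), path

        archive_hashes = set()
        for root in [r"E:\archive1", r"F:\archive2", r"G:\archive3"]:   # the three disks
            archive_hashes.update(h for h, _ in hashes_under(root))

        with open("new_files_report.txt", "w") as report:
            for h, path in hashes_under(r"\\bigdisk\photos"):           # the network share
                if h not in archive_hashes:
                    report.write(path + "\n")       # candidate to move into the archive

    Comparing by content hash rather than by name or size is what makes the two different folder structures irrelevant; the trade-off is that every file has to be read once.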


  • CodePlex Daily Summary for Wednesday, February 02, 2011

    CodePlex Daily Summary for Wednesday, February 02, 2011Popular ReleasesTweetSharp: TweetSharp v2.0.0.0 - Preview 10: Documentation for this release may be found at http://tweetsharp.codeplex.com/wikipage?title=UserGuide&referringTitle=Documentation. Note: This code is currently preview quality. Preview 9 ChangesAdded support for trends Added support for Silverlight 4 Elevated WP7 fixes Third Party Library VersionsHammock v1.1.7: http://hammock.codeplex.com Json.NET 4.0 Release 1: http://json.codeplex.comFacebook C# SDK: 5.0.0 (BETA): This is our first BETA release of the version 5 branch of the Facebook C# SDK. Remember this is a BETA build. Some things may change or not work exactly as planned. We are absolutely looking for feedback on this release to help us improve the final 5.X.X release. This release contains some breaking changes. Particularly with authentication. After spending time reviewing the trouble areas that people are having using this SDK (and Facebook in general) we decided to spend a good deal of time w...Phalanger - The PHP Language Compiler for the .NET Framework: 2.0 (February 2011): Next release of Phalanger; again faster, more stable and ready for daily use. Based on many user experiences this release is one more step closer to be perfect compiler and runtime of your old PHP applications; or perfect platform for migrating to .NET. February 2011 release of Phalanger introduces several changes, enhancements and fixes. See complete changelist for all the changes. To improve the performance of your application using MySQL, please use Managed MySQL Extension for Phalanger....Chemistry Add-in for Word: Chemistry Add-in for Word - Version 1.0: On February 1, 2011, we announced the availability of version 1 of the Chemistry Add-in for Word, as well as the assignment of the open source project to the Outercurve Foundation by Microsoft Research and the University of Cambridge. System RequirementsHardware RequirementsAny computer that can run Office 2007 or Office 2010. Software RequirementsYour computer must have the following software: Any version of Windows that can run Office 2007 or Office 2010, which includes Windows XP SP3 and...StyleCop for ReSharper: StyleCop for ReSharper 5.1.15005.000: Applied patch from rodpl for merging of stylecop setting files with settings in parent folder. Previous release: A considerable amount of work has gone into this release: Huge focus on performance around the violation scanning subsystem: - caching added to reduce IO operations around reading and merging of settings files - caching added to reduce creation of expensive objects Users should notice condsiderable perf boost and a decrease in memory usage. Bug Fixes: - StyleCop's new Objec...Minecraft Tools: Minecraft Topographical Survey 1.4: MTS requires version 4 of the .NET Framework - you must download it from Microsoft if you have not previously installed it. This version of MTS adds MCRegion support and fixes bugs that caused rendering to fail for some users. 
New in this version of MTS: Support for rendering worlds compressed with MCRegion Fixed rendering failure when encountering non-NBT files with the .dat extension Fixed rendering failure when encountering corrupt NBT files Minor GUI updates Note that the command...MVC Controls Toolkit: Mvc Controls Toolkit 0.8: Fixed the following bugs: *Variable name error in the jvascript file that prevented the use of the deleted item template of the Datagrid *Now after the changes applied to an item of the DataGrid are cancelled all input fields are reset to the very initial value they had. *Other minor bugs. Added: *This version is available both for MVC2, and MVC 3. The MVC 3 version has a release number of 0.85. This way one can install both version. *Client Validation support has been added to all control...Office Web.UI: Beta preview (Source): This is the first Beta. it includes full source code and all available controls. Some designers are not ready, and some features are not finalized allready (missing properties, draft styles) ThanksASP.net Ribbon: Version 2.2: This release brings some new controls (part of Office Web.UI). A few bugs are fixed and it includes the "auto resize" feature as you resize the window. (It can cause an infinite loop when the window is too reduced, it's why this release is not marked as "stable"). I will release more versions 2.3, 2.4... until V3 which will be the official launch of Office Web.UI. Both products will evolve at the same speed. Thanks.Barcode Rendering Framework: 2.1.1.0: Finally fixed bugs with code 128 symbology. It was envisioned that this would be the last release to target VS2008 but support will continue due in no small part to a desire to add SSRS support in the future.xUnit.net - Unit Testing for .NET: xUnit.net 1.7: xUnit.net release 1.7Build #1540 Important notes for Resharper users: Resharper support has been moved to the xUnit.net Contrib project. Important note for TestDriven.net users: If you are having issues running xUnit.net tests in TestDriven.net, especially on 64-bit Windows, we strongly recommend you upgrade to TD.NET version 3.0 or later. This release adds the following new features: Added support for ASP.NET MVC 3 Added Assert.Equal(double expected, double actual, int precision) Ad...Spark View Engine: Spark v1.5: Release Notes There have been a lot of minor changes going on since version 1.1, but most important to note are the major changes which include: Support for HTML5 "section" tag. Spark has now renamed its own section tag to "segment" instead to avoid clashes. You can still use "section" in a Spark sense for legacy support by specifying ParseSectionAsSegment = true if needed while you transition Bindings - this is a massive feature that further simplifies your views by giving you a powerful ...WPF Application Framework (WAF): WPF Application Framework (WAF) 2.0.0.3: Version: 2.0.0.3 (Milestone 3): This release contains the source code of the WPF Application Framework (WAF) and the sample applications. Requirements .NET Framework 4.0 (The package contains a solution file for Visual Studio 2010) The unit test projects require Visual Studio 2010 Professional Remark The sample applications are using Microsoft’s IoC container MEF. However, the WPF Application Framework (WAF) doesn’t force you to use the same IoC container in your application. You can use ...Rawr: Rawr 4.0.17 Beta: Rawr is now web-based. The link to use Rawr4 is: http://elitistjerks.com/rawr.phpThis is the Cataclysm Beta Release. 
More details can be found at the following link http://rawr.codeplex.com/Thread/View.aspx?ThreadId=237262 and on the Version Notes page: http://rawr.codeplex.com/wikipage?title=VersionNotes As of the 4.0.16 release, you can now also begin using the new Downloadable WPF version of Rawr!This is a pre-alpha release of the WPF version, there are likely to be a lot of issues. If you...Squiggle - A Free open source LAN Messenger: Squiggle 2.5 Beta: In this release following are the new features: Localization: Support for Arabic, French, German and Chinese (Simplified) Bridge: Connect two Squiggle nets across the WAN or different subnets Aliases: Special codes with special meaning can be embedded in message like (version),(datetime),(time),(date),(you),(me) Commands: cls, /exit, /offline, /online, /busy, /away, /main Sound notifications: Get audio alerts on contact online, message received, buzz Broadcast for group: You can ri...VivoSocial: VivoSocial 7.4.2: Version 7.4.2 of VivoSocial has been released. If you experienced any issues with the previous version, please update your modules to the 7.4.2 release and see if they persist. If you have any questions about this release, please post them in our Support forums. If you are experiencing a bug or would like to request a new feature, please submit it to our issue tracker. Web Controls * Updated Business Objects and added a new SQL Data Provider File. Groups * Fixed a security issue whe...PHP Manager for IIS: PHP Manager 1.1.1 for IIS 7: This is a minor release of PHP Manager for IIS 7. It contains all the functionality available in 56962 plus several bug fixes (see change list for more details). Also, this release includes Russian language support. SHA1 codes for the downloads are: PHPManagerForIIS-1.1.0-x86.msi - 6570B4A8AC8B5B776171C2BA0572C190F0900DE2 PHPManagerForIIS-1.1.0-x64.msi - 12EDE004EFEE57282EF11A8BAD1DC1ADFD66A654BloodSim: WControls.dll update: Priority Update It's just come to my attention that the latest version of WControls.dll was not included in the 1.4 release and as a result, BloodSim has been unuseable. Please download WControls.dll from here, and this will rectify the issue.VFPX: VFP2C32 2.0.0.8 Release Candidate: This release includes several bugfixes, new functions and finally a CHM help file for the complete library.DB>doc for Microsoft SQL Server: 1.0.0.0: Initial release Supported output HTML WikiPlex markup Raw XML Supported objects Tables Primary Keys Foreign Keys ViewsNew ProjectsAlay Plugin for Windows Live Writer: a project for creating an alay languageAnalyse Rapide des Droits sur une Arborescence du Système de Fichiers: Outil permettant une analyse simple sur une arborescence du système de fichier. L'idée étant de repérer rapidement les parties de l'arborescence qui changent de droits (arrêt de la propagation des droits). Developpement en C#, Interface WPF.Asp.Net Chat 4 All: Asp.Net Chat Application made easy By VaibhavBlogEngine.NET Image Picker: The image picker for BlogEngine.NET An utitilty to pick previously uploaded images in BlogEngine.NET blog posts. CMS for the kubert.info web-site: CMS for the kubert.info web-siteCommonRepositoryInterface: <project name> is a interface for abstracting away the details of the underlying data store behind a repositoryDragon Library Lite: This is a free version of my game engine missing some of its components, and parts of some, to make it so I could put this up for free without giving away all my source code. 
Also, any developers that show an interest here may be able to get access to the full version and help meeBookInfoGrabber: eBookInfoGrabber is a simple program designed to quickly grab any possible information you could desire on a given eBook using ISBN-10, ISBN-13, or by Title. It uses the Sony eBook store to gather all the data, and is developed in C#.Japanese Character Sets - Input, Identification and Comparison: Basic application written in C# which demonstrates the input, identification and comparison of specific Hiragana (half- and full-width), Katakana (half- and full-width), Kanji and Romaji Japanese characters. May be useful for developers beginning a project for Japanese users.Lighting UP 2011: 2011 NDSU Capstone project. Developing a Windows Phone 7 service application.Localization for SharePoint: Localize for SharePoint gives you the flexibility of localizing any site collection to any language. You don't need to have any language packs installed on your server. Any page on your site will be instantly translated to your preferred language.mzXML Corrupted Scans Remover: This application makes fast error correction on mzXML files. Each empty scan is being replaced by previous one. Program is optimized to work with large files (>2GB) with very small amout of memory needed while processing.Net Rank BR: Um projeto para criar analises sobre o log do jogo Urban Terror, um centro de encontro de players para jogo em time, um editor e criador de configs para Urt.NGinn.BPM: NGinn.BPM - a BPM / Workflow engine for Microsoft.NetOmegaEngine: A simple, light weight and easy to use Game Engine created in XNA. Intended to help reduce time needed with coding and focus more on game creation.Open Intel: Open Intel (OI) is an accelerator for rapidly building open data solutions with business intelligence and spatial analysis capabilities.Peekaboo - Proxy Server StarterKit: Peekaboo is a proxy server starterkit made in Java. Its a HTTP proxy server with a Swing UI to peek into the application/browser <-> server communication. Red Arrow: Turn based space strategy game. Source is published, but assets are considered private. Developed with C# / XNA FrameworkRPG engine: RPG engine for RPG games!Sancer: This is a school project which aids the teachers/advisers in choosing a students class depending on their grade on the placement test. Scaffold it !: "Scaffold it !" is a Visual Studio 2010 extension that enable you to scaffold elements from your entities. It's a time saver tool that leverage T4 templates and Visual Studio extensibility. SharePoint 2010 Design Guides for Developers: This projects holds visual guides to help developers and software architects take the best decision when building SharePoint ApplicationsSmith Image Converter: An easy-to-use tool to convert images format. At present, it supports bmp, png and jpeg type conversion.TFS Process Dashboard Integration: Small tool which allows integration of Team Foundation Server source control with the Process Dashboard line counter.Tibco Team Foundation Server (TFS) plugin: This project creates a Team Foundation Server plugin for Tibco BW Designer. It can be used from Tibco Designer as RCS adapter and manipulates the project files that are stored-version controlled- in TFS. ubotia: ubotiaXNA 4.0 Content Compiler: <project name> Compile to .xnb your texture files, audio files and SpriteFont files without adding to the Content project. Very Usefull in tools to create content for a game.


  • PHP-CGI slowly gains memory despite lack of requests.

    - by Kyle
    I know by now that PHP-CGI will sit there and hog my memory, but I'm not looking to reduce that; I'm just looking for a way to reset the processes every so often. I'm using PHP-CGI with FastCGI/Lighttpd. So the best way of putting this is: how could I reset these processes to prevent these annoying memory leaks?


  • Alternative to Amazon's S3 service?

    - by Cory
    Just wondering if there is a good alternative to Amazon's S3 service? I like S3 but the bandwidth cost is high. I looked at Cloud Files from Rackspace but the cost is even higher. I don't mind prepaying or making a monthly payment in order to reduce the bandwidth cost greatly. Thank you for any help.

