Search Results

Search found 6392 results on 256 pages for 'reduce duplicate'.

Page 24/256

  • Reduce file size for charts pasted from Excel into Word

    - by Steve Clanton
    I have been creating reports by copying charts and data from an Excel document into a Word document. I am pasting into a content control, so I use ChartObject.CopyPicture in Excel and ContentControl.Range.Paste in Word. This is done in a loop:

        Set ws = ThisWorkbook.Worksheets("Charts")
        With ws
            For Each cc In wordDocument.ContentControls
                If cc.Range.InlineShapes.Count > 0 Then
                    scaleHeight = cc.Range.InlineShapes(1).scaleHeight
                    scaleWidth = cc.Range.InlineShapes(1).scaleWidth
                    cc.Range.InlineShapes(1).Delete
                    .ChartObjects(cc.Tag).CopyPicture Appearance:=xlScreen, Format:=xlPicture
                    cc.Range.Paste
                    cc.Range.InlineShapes(1).scaleHeight = scaleHeight
                    cc.Range.InlineShapes(1).scaleWidth = scaleWidth
                ElseIf ...
            Next cc
        End With

    Creating these reports using Office 2007 yielded files that were around 6 MB, but creating them (using the same worksheet and document) in Office 2010 yields a file that is around 10 times as large. After unzipping the docx, I found that the extra size comes from the EMF files that correspond to the charts pasted in by VBA: where they ranged from 360 to 900 KB before, they are now 5-18 MB, and the graphics are not visibly better. I am able to CopyPicture with the format xlBitmap, and while that is somewhat smaller, it is still larger than the EMF generated by Office 2007 and of noticeably poorer quality. Are there any other options for reducing the file size? Ideally, I would like to produce a file with the same resolution for the charts as I did using Office 2007. Is there any way that uses VBA only (without modifying the charts in the spreadsheet)?

    Read the article

  • How to reduce compile time with C++ templates

    - by Shane MacLaughlin
    I'm in the process of changing part of my C++ app from using an older C-style array to a templated C++ container class. See this question for details. While the solution is working very well, each minor change I make to the templated code causes a very large amount of recompilation, and hence drastically slows the build. Is there any way of getting template code out of the header and back into a cpp file, so that minor implementation changes don't cause major rebuilds?
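
    One technique worth noting here (a sketch, not from the question; Container and the instantiated types are placeholders) is explicit instantiation: keep only declarations in the header and instantiate the template in a single .cpp for the element types actually used, so edits to the implementation recompile one file instead of every includer:

        // container.h - declarations only
        #include <vector>
        template <typename T>
        class Container {
        public:
            void push_back(const T& value);
        private:
            std::vector<T> data_;
        };

        // container.cpp - definitions plus explicit instantiations
        #include "container.h"
        template <typename T>
        void Container<T>::push_back(const T& value) { data_.push_back(value); }

        // Only these instantiations are generated; other translation units link against them.
        template class Container<int>;
        template class Container<double>;

    The trade-off is that every element type has to be listed in the .cpp, which suits cases where the set of types is small and known in advance.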

    Read the article

  • file to map/reduce program

    - by vana
    Hi, I am extracting parts of speech (POS) from XML documents, and I have an englishPCFG.ser.gz file that is used for the POS extraction. I cannot pass this .gz file as input in the HDFS directory, but my program needs it for parsing the XML files; the file is in my local directory, and I am getting a "File Not Found" error when I run my MapReduce program. How can I make it available to the mapper? I tried placing the file in HDFS where my XML files are present, I tried adding it to the .jar along with the class files, and I tried changing hdfs-default.xml to point to the local directory, but no luck. Please let me know how the mapper can read this file. Thank you.
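
    One standard route (a sketch, not from the post; the class name and HDFS path are assumptions) is Hadoop's DistributedCache: upload the model to HDFS once, register it in the job driver, and pick up the node-local copy in the mapper's setup():

        import java.io.IOException;
        import org.apache.hadoop.filecache.DistributedCache;
        import org.apache.hadoop.fs.Path;
        import org.apache.hadoop.io.LongWritable;
        import org.apache.hadoop.io.Text;
        import org.apache.hadoop.mapreduce.Mapper;

        public class PosMapper extends Mapper<LongWritable, Text, Text, Text> {
            private Path modelFile;   // local copy of englishPCFG.ser.gz on this task node

            @Override
            protected void setup(Context context) throws IOException {
                Path[] cached = DistributedCache.getLocalCacheFiles(context.getConfiguration());
                modelFile = cached[0];            // readable with ordinary java.io from here on
            }

            // Driver side, after uploading the model to HDFS once:
            //   DistributedCache.addCacheFile(new URI("/models/englishPCFG.ser.gz"),
            //                                 job.getConfiguration());
        }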

    Read the article

  • Reduce the file size of Excel

    - by Ram
    Hi, I'm working on an Excel application and providing a menu that lets the user add a new worksheet to the workbook. The worksheet is added once the user clicks the "OK" button, and I'm using a template to add it (the template has a lot of formatting and formulas in it). Let's say the file size is 10 MB after adding a worksheet and saving the workbook. If I then close Excel, reopen the file and save it, the file size drops to 8 MB. Can anybody let me know what the reason for this could be? Thanks, Ram

    Read the article

  • How to reduce DLL size again

    - by cemick
    My DLL has become many times larger than it was before, for some reason. Taking stock of the situation: the source hasn't changed; debug information is turned off everywhere; the DLL uses the package "Pack", but that package is not included in the Runtime Packages option. I compared the new DLL with the old one using PE Explorer, and in the new DLL I found many modules with the prefix 'ec' implicitly imported, unlike in the old DLL. The package "Pack" uses ecControls components, but the DLL does not explicitly call any ecControls units. Why are the ecControls units imported into the DLL? Does anybody have some advice?

    Read the article

  • Packaging sequences of PNG files in an iPhone app for animations to reduce bundle size

    - by Brad Smith
    Basically, I have an application that uses a flip-book style animation technique. I am simply cycling through around 1000 320x480 PNGs at 12 fps, and everything works really well, except for the fact that 1000 images take up a ton of disk space. Ideally I'd like to be able to compress these images into a movie file and pull out each frame as I need it, or simply play back a movie with frame-by-frame precision. Ideas?

    Read the article

  • How to reduce this if-else ladder in C#

    - by Rohit
    This is the if-else ladder I created to focus the first visible control on my form. Per the requirement, any control on the form can be hidden, so I had to find the first visible control and focus it:

        if (ddlTranscriptionMethod.Visible) { ddlTranscriptionMethod.Focus(); }
        else if (ddlSpeechRecognition.Visible) { ddlSpeechRecognition.Focus(); }
        else if (!SliderControl1.SliderDisable) { SliderControl1.Focus(); }
        else if (ddlESignature.Visible) { ddlESignature.Focus(); }
        else
        {
            if (tblDistributionMethods.Visible)
            {
                if (chkViaFax.Visible) { chkViaFax.Focus(); }
                else if (chkViaInterface.Visible) { chkViaInterface.Focus(); }
                else if (chkViaPrint.Visible) { chkViaPrint.Focus(); }
                else { chkViaSelfService.Focus(); }
            }
        }

    Is there any other way of doing this? I thought using LINQ would hog performance, since I would have to traverse the whole page control collection, and I am deep on a page that uses master pages. Please suggest.
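
    One alternative (a sketch, not from the post; it assumes all of these controls derive from System.Web.UI.Control) is to list the candidates in focus-priority order together with their conditions and loop until one matches, so the order lives in data rather than in nesting:

        // In the page's code-behind (needs using System; using System.Web.UI;).
        private void FocusFirstVisibleControl()
        {
            var candidates = new[]
            {
                Tuple.Create<Control, Func<bool>>(ddlTranscriptionMethod, () => ddlTranscriptionMethod.Visible),
                Tuple.Create<Control, Func<bool>>(ddlSpeechRecognition, () => ddlSpeechRecognition.Visible),
                Tuple.Create<Control, Func<bool>>(SliderControl1, () => !SliderControl1.SliderDisable),
                Tuple.Create<Control, Func<bool>>(ddlESignature, () => ddlESignature.Visible),
                Tuple.Create<Control, Func<bool>>(chkViaFax, () => tblDistributionMethods.Visible && chkViaFax.Visible),
                Tuple.Create<Control, Func<bool>>(chkViaInterface, () => tblDistributionMethods.Visible && chkViaInterface.Visible),
                Tuple.Create<Control, Func<bool>>(chkViaPrint, () => tblDistributionMethods.Visible && chkViaPrint.Visible),
                Tuple.Create<Control, Func<bool>>(chkViaSelfService, () => tblDistributionMethods.Visible)
            };

            foreach (var candidate in candidates)
            {
                if (candidate.Item2())          // first condition that holds wins
                {
                    candidate.Item1.Focus();
                    return;
                }
            }
        }

    This only touches the named controls, so it does not traverse the page's whole control tree.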

    Read the article

  • Reduce text length to fit cell width in a smart manner

    - by Andrei Ciobanu
    I am on a project where we are building a simple web calendar using Java EE technologies. We define a table where every row is an employee and every column represents an hour interval. The table width and column widths are adjustable. In every cell we show a text retrieved from a database, indicating what the employee is doing / should do in that time interval. The problem is that sometimes the text gets larger than the actual cell. My task is to make the text more "readable" by reducing its length in a "smart way" so that it fits the cell more gracefully. For example, if a cell initially contains "Writing documents", after the resize I should get "Wrtng. dcmnts" or "Writ. docum." so that the text fits well. Is there a smart way to do this, or is removing vowels / splitting the string in two enough?
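
    A crude starting point (my sketch, not from the post; abbreviate is a hypothetical helper) is per-word vowel stripping that keeps each word's first letter and only kicks in when the text exceeds the cell's character budget:

        // Abbreviates text only when it is longer than maxChars, by dropping vowels
        // (except a word's first letter) from each word: "Writing" -> "Wrtng".
        static String abbreviate(String text, int maxChars) {
            if (text.length() <= maxChars) {
                return text;                       // short enough, leave it readable
            }
            StringBuilder out = new StringBuilder();
            for (String word : text.split("\\s+")) {
                if (word.isEmpty()) continue;
                if (out.length() > 0) out.append(' ');
                out.append(word.charAt(0))
                   .append(word.substring(1).replaceAll("[aeiouAEIOU]", ""));
            }
            return out.toString();
        }

    With this sketch, abbreviate("Writing documents", 12) returns "Wrtng dcmnts", close to the example above; a fancier version could fall back to hard truncation when even the vowel-stripped form is still too wide for the cell.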

    Read the article

  • How to reduce the time of clang_complete's search through Boost

    - by kirill_igum
    I like using clang with vim. The one problem I always have is that whenever I include Boost, clang goes through the Boost library every time I put "." after an object name, and it takes 5-10 seconds. Since I don't make changes to the Boost headers, is there a way to cache the search through Boost? If not, is there a way to remove Boost from the auto-completion search?

    Update (1), in response to the answer by adaszko: after :let g:clang_use_library = 1, I type the name of a variable and press ^N. Vim starts to search through the Boost tree and auto-completes the variable. I then press "." and get the following errors, and there is no auto-completion:

        Error detected while processing function ClangComplete:
        line 35:
        Traceback (most recent call last):
        Press ENTER or type command to continue
        Error detected while processing function ClangComplete:
        line 35:
        File "<string>", line 1, in <module>
        Press ENTER or type command to continue
        Error detected while processing function ClangComplete:
        line 35:
        NameError: name 'vim' is not defined
        Press ENTER or type command to continue
        Error detected while processing function ClangComplete:
        line 40:
        E121: Undefined variable: l:res
        Press ENTER or type command to continue
        Error detected while processing function ClangComplete:
        line 40:
        E15: Invalid expression: l:res
        Press ENTER or type command to continue
        Error detected while processing function ClangComplete:
        line 58:
        E121: Undefined variable: l:res
        Press ENTER or type command to continue
        Error detected while processing function ClangComplete:
        line 58:
        E15: Invalid expression: l:res
        Press ENTER or type command to continue
        ...

    Update (2): I am not sure whether clang_complete should take care of the Boost issue at all; vim without plugins also searches through Boost. Superuser has an answer that comments the Boost directories out of the include search with

        set include=^\\s*#\\s*include\ \\(<boost/\\)\\@!

    Read the article

  • Ruby: reduce duplication while initializing a hash

    - by user612308
        array = [0, 0.3, 0.4, 0.2, 0.6]
        hash = {
          "key1" => array[0..2],
          "key2" => array[0..3],
          "key3" => array,
          "key4" => array,
          "key5" => array,
          "key6" => array,
          "key7" => array
        }

    Is there a way I can remove the duplication by doing something like

        hash = {
          "key1" => array[0..2],
          "key2" => array[0..3],
          %(key3, key4, key5, key6, key7).each {|ele| ele => array}
        }
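
    One way to get that effect (a sketch, not from the post) is to generate the repeated entries from a list of keys and merge the two prefix slices on top:

        array = [0, 0.3, 0.4, 0.2, 0.6]

        # Every key in the %w list maps to the full array...
        hash = Hash[%w(key3 key4 key5 key6 key7).map { |k| [k, array] }]
        # ...and the two special-cased keys are merged in afterwards.
        hash = hash.merge("key1" => array[0..2], "key2" => array[0..3])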

    Read the article

  • Globally define parts of virtualhost definition to reduce redundancy

    - by user1428900
    I have set up an Apache server that serves a bunch of virtual hosts. The virtual hosts are defined as follows:

        <VirtualHost *:80>
            ServerName <url>
            RewriteEngine on
            ReWriteCond %{SERVER_PORT} !^443$
            RewriteRule ^/(.*) https://%{HTTP_HOST}/$1 [NC,R,L]

            # Enable client-side caching of resources
            ExpiresActive On
            ExpiresByType image/png "now plus 48 hours"
            ExpiresByType image/jpeg "now plus 48 hours"
            ExpiresByType image/gif "now plus 48 hours"
            ExpiresByType application/javascript "now plus 48 hours"
            ExpiresByType application/x-javascript "now plus 48 hours"
            ExpiresByType text/javascript "now plus 48 hours"
            ExpiresByType text/css "now plus 48 hours"
        </VirtualHost>
        .
        .
        .
        <VirtualHost *:80>
            ServerName <url>
            RewriteEngine on
            ReWriteCond %{SERVER_PORT} !^443$
            RewriteRule ^/(.*) https://%{HTTP_HOST}/$1 [NC,R,L]

            # Enable client-side caching of resources
            ExpiresActive On
            ExpiresByType image/png "now plus 48 hours"
            ExpiresByType image/jpeg "now plus 48 hours"
            ExpiresByType image/gif "now plus 48 hours"
            ExpiresByType application/javascript "now plus 48 hours"
            ExpiresByType application/x-javascript "now plus 48 hours"
            ExpiresByType text/javascript "now plus 48 hours"
            ExpiresByType text/css "now plus 48 hours"
        </VirtualHost>

    I have a ton of virtual host definitions similar to the examples shown above. Since most of each definition other than ServerName is the same, I was wondering whether there is a way to define these common directives globally. I am new to Apache configuration, and as the number of virtual host definitions grows, my configuration file becomes longer and more redundant.
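
    One way to cut the repetition (a sketch, not from the post; the include path is an assumption) is to move the shared directives into one file and pull it into every virtual host with Include, so that only ServerName differs per host:

        # /etc/apache2/vhost-common.conf
        RewriteEngine on
        RewriteCond %{SERVER_PORT} !^443$
        RewriteRule ^/(.*) https://%{HTTP_HOST}/$1 [NC,R,L]

        ExpiresActive On
        ExpiresByType image/png "now plus 48 hours"
        # ... the remaining ExpiresByType lines from above ...

        # Each virtual host then shrinks to:
        <VirtualHost *:80>
            ServerName <url>
            Include /etc/apache2/vhost-common.conf
        </VirtualHost>

    mod_macro is another option when more than just ServerName varies between hosts.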

    Read the article

  • java - reduce external jar file size

    - by joe_shmoe
    Hi all, still learning, so be patient :) I've developed a module for a Java project. The module depends on an external library (fastutil). The problem is that the fastutil.jar file is a couple of times heavier than the whole project itself (14 MB), and I only use a tiny subset of the classes from the library. The module is now finished, and no one is likely to extend it in future. Is there a way I could extract only the relevant classes into some fastutil_small.jar, so that others don't have to download all this extra weight? There's probably a simple answer to this, but as I said, I still consider myself a noob. Thanks a lot.
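
    One option (a sketch, not from the post) is a class shrinker such as ProGuard: feed it the module jar plus fastutil and keep only the module's own classes, and it drops every fastutil class that is not reachable from them. The jar names and package below are placeholders:

        -injars  mymodule.jar
        -injars  fastutil.jar(!META-INF/**)
        -outjars mymodule-with-trimmed-fastutil.jar
        -libraryjars <java.home>/lib/rt.jar
        -dontoptimize
        -dontobfuscate
        # Keep the module's classes; fastutil classes they reference survive automatically.
        -keep class com.example.mymodule.** { *; }

    Classes loaded via reflection would need explicit -keep rules of their own.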

    Read the article

  • Reduce query time (optimise query)

    - by user2527657
        select a.userid,
               (select firstName from user where userid = NOTUSED.userid) as z,
               (select max(login_time) from userLoginTime AS b
                where userid = a.user_id
                GROUP BY b.user_id ORDER BY b.user_id) as y
        From (SELECT DISTINCT a.user_id
              FROM user AS a
              LEFT OUTER JOIN (SELECT (userid) FROM userlogintime where serialid = 15400012) AS b
                ON user.user_id = b.user_id
              where a.Serialid = 15400012 AND b.userid IS NULL) NOTUSED,
             Relation r,
             user a
        where r.childuserid = NOTUSED.userid
          and guarduserid = a.userid

    Read the article

  • Duplicate DNS Zones (Error 4515 in Event Log )

    - by Campo
    I am getting these two errors in the DNS event log (quoted at the end of this question). I have confirmed that I do have duplicate zones, and I am wondering which ones to delete. The DomainDNSZone contains all of our DNS records, but it does not have the _msdcs zone; that one is in the ForestDNSZone, along with the duplicates that are not in use. Three questions:

    1. I understand the advantages of having DNS in the ForestDNSZone, so why is DNS using the DomainDNSZone, and is that acceptable considering _msdcs is in the ForestDNSZone?
    2. If so, should I just delete DC=1.168.192.in-addr.arpa and DC=supernova.local from the ForestDNSZone, or should I try to make those the copies in use? If the latter, what are the steps? I understand how to delete, that is simple, but if I must move zones some info would be appreciated.
    3. Just to confirm my understanding: I can delete the two duplicates in the ForestDNSZone and leave _msdcs.supernova.local, as that is required there, and this will resolve the errors I see.

    FYI, when I look at those folders in the ForestDNSZone they have just 2 and 1 entries respectively, so they are obviously not in use compared to the others. I am pretty sure I understand the steps to complete this, but if you would like to provide that info, bonus points!

        Event Type: Warning
        Event Source: DNS
        Event Category: None
        Event ID: 4515
        Date: 1/4/2011
        Time: 2:14:18 PM
        User: N/A
        Computer: STANLEY
        Description: The zone 1.168.192.in-addr.arpa was previously loaded from the directory partition DomainDnsZones.supernova.local but another copy of the zone has been found in directory partition ForestDnsZones.supernova.local. The DNS Server will ignore this new copy of the zone. Please resolve this conflict as soon as possible. If an administrator has moved this zone from one directory partition to another this may be a harmless transient condition. In this case, no action is necessary. The deletion of the original copy of the zone should soon replicate to this server. If there are two copies of this zone in two different directory partitions but this is not a transient caused by a zone move operation then one of these copies should be deleted as soon as possible to resolve this conflict. To change the replication scope of an application directory partition containing DNS zones and for more details on storing DNS zones in the application directory partitions, please see Help and Support. For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.
        Data: 0000: 89 25 00 00 %..

    and

        Event Type: Warning
        Event Source: DNS
        Event Category: None
        Event ID: 4515
        Date: 1/4/2011
        Time: 2:14:18 PM
        User: N/A
        Computer: STANLEY
        Description: The zone supernova.local was previously loaded from the directory partition DomainDnsZones.supernova.local but another copy of the zone has been found in directory partition ForestDnsZones.supernova.local. The DNS Server will ignore this new copy of the zone. Please resolve this conflict as soon as possible. If an administrator has moved this zone from one directory partition to another this may be a harmless transient condition. In this case, no action is necessary. The deletion of the original copy of the zone should soon replicate to this server. If there are two copies of this zone in two different directory partitions but this is not a transient caused by a zone move operation then one of these copies should be deleted as soon as possible to resolve this conflict. To change the replication scope of an application directory partition containing DNS zones and for more details on storing DNS zones in the application directory partitions, please see Help and Support. For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.
        Data: 0000: 89 25 00 00 %..

    Read the article

  • Duplicate cache pages: Varnish

    - by Sukhjinder Singh
    Recently we configured Varnish on our server. It was set up successfully, but we noticed that if we open any page in multiple browsers, Varnish sends the request to Apache regardless of whether the page is cached or not, and if we refresh twice in each browser it creates duplicate copies of the same page. What should happen instead: if a page is cached by Varnish, subsequent requests should be served from Varnish itself, whether we open the same page again in the same browser or open that page from a different IP address. Following is my default.vcl file:

        backend default {
            .host = "127.0.0.1";
            .port = "80";
        }

        sub vcl_recv {
            if (req.url ~ "^/search/.*$") {
            } else {
                set req.url = regsub(req.url, "\?.*", "");
            }
            if (req.restarts == 0) {
                if (req.http.x-forwarded-for) {
                    set req.http.X-Forwarded-For = req.http.X-Forwarded-For + ", " + client.ip;
                } else {
                    set req.http.X-Forwarded-For = client.ip;
                }
            }
            if (!req.backend.healthy) {
                unset req.http.Cookie;
            }
            set req.grace = 6h;
            if (req.url ~ "^/status\.php$" ||
                req.url ~ "^/update\.php$" ||
                req.url ~ "^/admin$" ||
                req.url ~ "^/admin/.*$" ||
                req.url ~ "^/flag/.*$" ||
                req.url ~ "^.*/ajax/.*$" ||
                req.url ~ "^.*/ahah/.*$") {
                return (pass);
            }
            if (req.url ~ "(?i)\.(pdf|asc|dat|txt|doc|xls|ppt|tgz|csv|png|gif|jpeg|jpg|ico|swf|css|js)(\?.*)?$") {
                unset req.http.Cookie;
            }
            if (req.http.Cookie) {
                set req.http.Cookie = ";" + req.http.Cookie;
                set req.http.Cookie = regsuball(req.http.Cookie, "; +", ";");
                set req.http.Cookie = regsuball(req.http.Cookie, ";(SESS[a-z0-9]+|SSESS[a-z0-9]+|NO_CACHE)=", "; \1=");
                set req.http.Cookie = regsuball(req.http.Cookie, ";[^ ][^;]*", "");
                set req.http.Cookie = regsuball(req.http.Cookie, "^[; ]+|[; ]+$", "");
                if (req.http.Cookie == "") {
                    unset req.http.Cookie;
                } else {
                    return (pass);
                }
            }
            if (req.request != "GET" && req.request != "HEAD" &&
                req.request != "PUT" && req.request != "POST" &&
                req.request != "TRACE" && req.request != "OPTIONS" &&
                req.request != "DELETE") {
                return (pipe);
            } /* Non-RFC2616 or CONNECT which is weird. */
            if (req.request != "GET" && req.request != "HEAD") {
                return (pass);
            }
            if (req.http.Accept-Encoding) {
                if (req.url ~ "\.(jpg|png|gif|gz|tgz|bz2|tbz|mp3|ogg)$") {
                    # No point in compressing these
                    remove req.http.Accept-Encoding;
                } else if (req.http.Accept-Encoding ~ "gzip") {
                    set req.http.Accept-Encoding = "gzip";
                } else if (req.http.Accept-Encoding ~ "deflate") {
                    set req.http.Accept-Encoding = "deflate";
                } else {
                    # unknown algorithm
                    remove req.http.Accept-Encoding;
                }
            }
            return (lookup);
        }

        sub vcl_deliver {
            if (obj.hits > 0) {
                set resp.http.X-Varnish-Cache = "HIT";
            } else {
                set resp.http.X-Varnish-Cache = "MISS";
            }
        }

        sub vcl_fetch {
            if (beresp.status == 404 || beresp.status == 301 || beresp.status == 500) {
                set beresp.ttl = 10m;
            }
            if (req.url ~ "(?i)\.(pdf|asc|dat|txt|doc|xls|ppt|tgz|csv|png|gif|jpeg|jpg|ico|swf|css|js)(\?.*)?$") {
                unset beresp.http.set-cookie;
            }
            set beresp.grace = 6h;
        }

        sub vcl_hash {
            hash_data(req.url);
            if (req.http.host) {
                hash_data(req.http.host);
            } else {
                hash_data(server.ip);
            }
            return (hash);
        }

        sub vcl_pipe {
            set req.http.connection = "close";
        }

        sub vcl_hit {
            if (req.request == "PURGE") { ban_url(req.url); error 200 "Purged"; }
            if (!obj.ttl > 0s) { return (pass); }
        }

        sub vcl_miss {
            if (req.request == "PURGE") { error 200 "Not in cache"; }
        }

    Read the article

  • How to Avoid Duplicate Key Exception

    - by LifeH2O
    I am using a TableAdapter to insert records into a table within a loop:

        foreach (....)
        {
            ....
            teamsTableAdapter.Insert(_teamid, _teamname);
            ....
        }

    TeamID is the primary key of the table, and after the first run of this loop, Insert throws a "duplicate primary key found" exception. To handle this, I have done the following:

        foreach (....)
        {
            ....
            try
            {
                _teamsTableAdapter.Insert(_teamid, _teamname);
            }
            catch (System.Data.SqlClient.SqlException e)
            {
                if (e.Number != 2627)
                    MessageBox.Show(e.Message);
            }
            ....
        }

    But using a try/catch statement is costly. How can I avoid this exception? I am working in VS2010, and INSERT ... ON DUPLICATE KEY UPDATE does not work.
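
    One way to sidestep the exception (a sketch, not from the post; the loop source and field names stand in for the elided code) is to track which keys have already been inserted and skip repeats before calling Insert:

        // Remember the team IDs inserted so far in this run.
        var insertedTeamIds = new HashSet<int>();

        foreach (var team in teams)
        {
            if (!insertedTeamIds.Add(team.TeamId))
            {
                continue;                // already inserted once, skip instead of throwing
            }
            _teamsTableAdapter.Insert(team.TeamId, team.TeamName);
        }

    This only guards against duplicates produced by the loop itself; rows that already exist in the table would still need a lookup first, or an IF NOT EXISTS / MERGE statement on the SQL Server side (ON DUPLICATE KEY UPDATE is MySQL syntax, which is why it does not work here).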

    Read the article

  • Scheme: Detecting duplicate elements in a list

    - by Kyle Krull
    Does R6RS or Chez Scheme v7.9.4 have a library function to check whether a list contains duplicate elements? Alternatively, does either have any built-in functionality for sets (which disallow duplicate elements)? So far, I've only been able to find an example here. The problem with that is that it doesn't appear to actually be part of the Chez Scheme library. Although I could write my own version of this, I'd much rather use a well-known, tested, and maintained library function, especially given how basic an operation this is. So a simple "use these built-in functions" or "no built-in library implements this" will suffice. Thanks!
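
    For reference, a hand-rolled check is only a few lines (my sketch, quadratic, comparing with equal? via member from (rnrs lists)), though it is of course not the tested library routine the question asks for:

        ;; #t if any element occurs more than once in the list.
        (define (has-duplicates? lst)
          (cond ((null? lst) #f)
                ((member (car lst) (cdr lst)) #t)
                (else (has-duplicates? (cdr lst)))))

        (has-duplicates? '(1 2 3 2))   ; => #t
        (has-duplicates? '(1 2 3))     ; => #f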

    Read the article

  • Controlling Drupal's active/active-trail with duplicate menu items

    - by Mark
    I'm developing a site that requires some duplication of links within the menu:

        Section A
        -- Introduction
        -- Testimonials
        Section B
        -- Introduction
        -- Testimonials
        Testimonials
        -- Section A
        -- Section B

    So 'Section A Testimonials' and 'Testimonials Section A' point to the same node. But regardless of which menu link people use, I want the person to be in Section A. The problem is that D6 doesn't like duplicate menu items, and it assigns the active and active-trail classes rather unpredictably. So my thought was to create a placeholder node for each item in the Testimonials menu, set the URL to something like "testimonials/redirect/section-a", and then use mod_rewrite to redirect over to "section-a/testimonials". With this solution, I will have no duplicate paths in the menu. I'm just hoping this doesn't somehow hurt my SEO. Does anyone know a better solution?

    Read the article

  • jQuery.load() Retrieving partial page content causes duplicate ID in DOM

    - by Warren Buckley
    Hello all, I currently have a JS function that loads partial content from another page into the current page using jQuery.load(). However, I noticed that when using this I get a duplicate ID in the DOM (when inspecting with Firebug). The function is used in conjunction with a Flash building viewer, so it does not contain any code to retrieve the URL from the anchor tag.

        function displayApartment(apartmentID) {
            $("#apartmentInfo").load(siteURL + apartmentID + ".aspx #apartmentInfo", function () {
                // re-do links loaded in
                $("a.gallery").colorbox({ opacity: "0.5" });
            });
        }

    The code above works just fine for retrieving the content; it just bugs me that when inspecting with Firebug I get something like this:

        <div id="apartmentInfo">
            <div id="apartmentInfo">
                <!-- Remote HTML here..... -->
            </div>
        </div>

    Can anyone suggest how to remove the duplicate div? Thanks, Warren
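
    One workaround (a sketch, not from the post) is to ask .load() for the children of the remote container rather than the container itself, so the wrapper div is not nested a second time:

        function displayApartment(apartmentID) {
            // "#apartmentInfo > *" selects the contents of the remote div, not the div itself.
            $("#apartmentInfo").load(siteURL + apartmentID + ".aspx #apartmentInfo > *", function () {
                $("a.gallery").colorbox({ opacity: "0.5" });
            });
        }

    Note that text nodes sitting directly inside the remote #apartmentInfo (outside any child element) would be dropped by this selector.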

    Read the article

  • How to copy or duplicate gtk widgets?

    - by PP
    Hi, how can I copy or duplicate GTK widgets? In my application I have one huge GtkComboBox that is created with one long for loop, which eats up a lot of time, and I use this combo in two places on a single screen. What I want to do is create this combo once and duplicate/copy it for the other place, to save that time. If I try to add the same combo box pointer twice, GTK gives me the error "child-paren != NULL", because in GTK a widget can have only a single parent. So what should I do? Thanks, PP.
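
    A GTK-idiomatic workaround (a sketch, not from the post) is not to copy the widget but to share its model: fill one GtkListStore in the long loop, then create two cheap GtkComboBox views of it, so the expensive loop runs only once:

        #include <gtk/gtk.h>

        /* Both combos share one model, so the expensive fill loop runs only once. */
        static GtkWidget *combo_from_model(GtkTreeModel *model)
        {
            GtkWidget *combo = gtk_combo_box_new_with_model(model);
            GtkCellRenderer *renderer = gtk_cell_renderer_text_new();
            gtk_cell_layout_pack_start(GTK_CELL_LAYOUT(combo), renderer, TRUE);
            gtk_cell_layout_set_attributes(GTK_CELL_LAYOUT(combo), renderer, "text", 0, NULL);
            return combo;
        }

        /* Usage sketch:
         *   GtkListStore *store = gtk_list_store_new(1, G_TYPE_STRING);
         *   ... run the existing long loop once, appending rows to store ...
         *   GtkWidget *combo1 = combo_from_model(GTK_TREE_MODEL(store));
         *   GtkWidget *combo2 = combo_from_model(GTK_TREE_MODEL(store));
         */

    Each combo is still its own widget (so each has its own parent), but both display the same data.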

    Read the article

  • How to check for duplicate files?

    - by miorel
    I have an external hard drive on which I have backed up files several times. Some files were modified between backups, others were not. Some may have been renamed. Now I'm running out of space, and I'd like to clean up duplicate files. My idea was to md5sum every file on the drive, then look for duplicates, and diff the relevant files (just in case, haha). Is this the best way to do this? What are some other methods of checking for duplicate files?
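
    For what it's worth, a sketch of the md5sum approach with GNU coreutils (the mount point is a placeholder); dedicated tools such as fdupes -r do the same grouping and add a byte-for-byte comparison:

        # Hash every file once, then group identical hashes; matches are only candidates,
        # so confirm with cmp before deleting anything.
        find /mnt/backup -type f -exec md5sum {} + > /tmp/hashes.txt
        sort /tmp/hashes.txt | uniq -w32 --all-repeated=separate > /tmp/duplicates.txt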

    Read the article

  • Prevent duplicate values using jQuery Validation

    - by Yashwant Chavan
    I have a form whose text fields are generated dynamically using JSP, and I am using jQuery validation. I want to add functionality that prevents duplicate entries in the form. E.g.:

        <form name="myForm" id="myForm">
            <input type="text" name="text1" id="text1">
            <input type="text" name="text2" id="text2">
            <input type="text" name="text3" id="text3">
            <!-- ... N number of form fields ... -->
            <input type="text" name="textn" id="textn">
        </form>

    I want to check, using jQuery validation, whether a duplicate value has been entered in any of the text fields. Thanks
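
    A possible shape for this (my sketch, not from the post; it assumes the jQuery Validation plugin is loaded, and "uniqueValue" is just a made-up rule name) is a custom validator method that rejects a value already present in another text field, attached to each generated field:

        // Reject a value that already appears in another text field of the same form.
        $.validator.addMethod("uniqueValue", function (value, element) {
            if (this.optional(element)) { return true; }
            var duplicate = false;
            $(element.form).find("input[type=text]").not(element).each(function () {
                if ($(this).val() === value) { duplicate = true; }
            });
            return !duplicate;
        }, "This value has already been entered.");

        $("#myForm").validate();

        // The fields are generated dynamically, so attach the rule to each of them in a loop.
        $("#myForm input[type=text]").each(function () {
            $(this).rules("add", { uniqueValue: true });
        });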

    Read the article

  • Keep duplicate number records only - Perl

    - by manu
    Hello, I have a text string containing some duplicate characters (e.g. FFGGHHJKL), and these can be made unique using a positive lookahead, for example perl -pe 's/(.)(?=.*?\1)//g' (FFEEDDCCGG gives the output FEDCG). My question is how to make this work on numbers: e.g. for "212 212 43 43 5689 6689 5689 71 81" the output should be "212 43 5689 6689 71 81". Also, if we want only the duplicate records to be output from a file with n rows (e.g. "212 212 43 43 5689 6689 5689 71 81" and "66 66 67 68 69 69 69 71 71 52" should give "212 212 43 43 5689 5689" and "66 66 69 69 69 71 71"), what should be done? Thanks and regards, manu
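
    For the second part, a sketch of one way to do it (mine, not from the post): count each field per line, then print only the fields whose count is greater than one, preserving their order:

        #!/usr/bin/perl
        use strict;
        use warnings;

        # "212 212 43 43 5689 6689 5689 71 81" -> "212 212 43 43 5689 5689"
        while (my $line = <>) {
            my @fields = split ' ', $line;
            my %count;
            $count{$_}++ for @fields;
            print join(' ', grep { $count{$_} > 1 } @fields), "\n";
        }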

    Read the article
