Search Results

Search found 23131 results on 926 pages for 'ms query'.

Page 557 of 926

  • Sonata Media Bundle sortBy created_at

    - by tony908
    I use SonataMediaBundle and I would like to sort Gallery by its created_at field. In my repository class I have (it works fine without the orderBy):

        $qb = $this->createQueryBuilder('m')
            ->orderBy('j.expires_at', 'DESC');
        $query = $qb->getQuery();
        return $query->getResult();

    and this throws an error:

        An exception has been thrown during the rendering of a template ("[Semantical Error] line 0, col 80 near 'created_at D': Error: Class Application\Sonata\MediaBundle\Entity\Gallery has no field or association named created_at")

    so I added the field to the Gallery class:

        /**
         * @var \DateTime
         */
        private $created_at;

        /**
         * Set created_at
         *
         * @param \DateTime $createdAt
         * @return Slider
         */
        public function setCreatedAt($createdAt)
        {
            $this->created_at = $createdAt;
            return $this;
        }

        /**
         * Get created_at
         *
         * @return \DateTime
         */
        public function getCreatedAt()
        {
            return $this->created_at;
        }

    but now I get this error:

        FatalErrorException: Compile Error: Declaration of Application\Sonata\MediaBundle\Entity\Gallery::setCreatedAt() must be compatible with Sonata\MediaBundle\Model\GalleryInterface::setCreatedAt(DateTime $createdAt = NULL) in /home/tony/www/test/Application/Sonata/MediaBundle/Entity/Gallery.php line 32

    GalleryInterface: https://github.com/sonata-project/SonataMediaBundle/blob/master/Model/GalleryInterface.php

    So... how can I use sortBy in my example?
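    The fatal error quoted above is about the method signature: GalleryInterface declares setCreatedAt(DateTime $createdAt = NULL), so the overriding method must also accept a nullable \DateTime. A minimal, self-contained sketch of that constraint (GalleryLike is a stand-in interface, not the real Sonata one):

        <?php
        // Sketch only: PHP requires the implementing method to be compatible
        // with the interface, so the "\DateTime ... = null" signature must be kept.
        interface GalleryLike
        {
            public function setCreatedAt(\DateTime $createdAt = null);
        }

        class Gallery implements GalleryLike
        {
            private $created_at;

            public function setCreatedAt(\DateTime $createdAt = null)
            {
                $this->created_at = $createdAt;
                return $this;
            }

            public function getCreatedAt()
            {
                return $this->created_at;
            }
        }

    With the signature fixed, the remaining ordering problem is likely naming: the query builder aliases the entity as 'm' but orders by 'j.expires_at', and a Doctrine field is usually exposed as createdAt rather than created_at, so something like ->orderBy('m.createdAt', 'DESC') is probably what the repository needs (an assumption based on the error message, not on the bundle's actual mapping).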

    Read the article

  • What can stop IIS7 from restarting an ASP.NET app when updating a DLL in the bin folder?

    - by Carl Björknäs
    We're running ASP.NET 2.0 on MS Server 2008 and IIS 7. During the last few releases the app pool has not been restarted automatically after changes in the bin folder. It works like a charm on our test server, but not on the live server. The site is browsable, but it runs with the logic of the old version of the updated DLL. One of the changes we have made lately is that one of the DLLs in the bin folder now consists of other DLLs that have been merged with ILMerge; Interop.ADODB.dll and Interop.CDO.dll are included in the merged DLL. It is the user DLL within the merged DLL that was updated. What can possibly prevent IIS from restarting the app pool even though a file has changed in the bin folder?

    Read the article

  • C# DataSet: Dynamically Add DataColumn

    - by Wesley
    I am trying to add an extra column to a DataSet after a query has completed. I have the following database relationship: Employees is related to Groups through the EmployeeGroups join table. Employees holds all the data for each individual; I'll call its unique key UserID. Groups holds all the groups that an employee can be a part of, e.g. Super User, Admin, User, etc.; its unique key is GroupID. EmployeeGroups holds the associations of which groups each employee belongs to (UserID | GroupID). What I am trying to accomplish: after querying for all users, I want to loop through each user and record which groups that user is a part of, by adding a new string column named 'Groups' to the DataSet and inserting the values of a second query that fetches that user's groups. Then, by use of data binding, I populate a ListView with all employees and their group associations. My code is as follows; position 5 is the new column I am trying to add to the DataSet.

        string theQuery = "select UserID, FirstName, LastName, EmployeeID, Active from Employees";
        DataSet theEmployeeSet = itsDatabase.runQuery(theQuery);
        DataColumn theCol = new DataColumn("Groups", typeof(string));
        theEmployeeSet.Tables[0].Columns.Add(theCol);
        foreach (DataRow theRow in theEmployeeSet.Tables[0].Rows)
        {
            theRow.ItemArray[5] = "1234";
        }

    At the moment the code will create the new column, but when I assign data to that column nothing is assigned. What am I missing? If there is any further explanation or information I can provide, please let me know. Thank you all.

    Read the article

  • SugarCRM REST API always returns "null"

    - by TuomasR
    I'm trying to test out the SugarCRM REST API, running the latest version of CE (6.0.1). It seems that whenever I do a query, the API returns "null" and nothing else. If I omit the parameters, the API returns the API description (which the documentation says it should). I'm trying to perform a login, passing as parameters the method (login), input_type and response_type (json), and rest_data (JSON-encoded parameters). The following code does the query:

        $api_target = "http://example.com/sugarcrm/service/v2/rest.php";
        $parameters = json_encode(array(
            "user_auth" => array(
                "user_name" => "admin",
                "password" => md5("adminpassword"),
            ),
            "application_name" => "Test",
            "name_value_list" => array(),
        ));
        $postData = http_build_query(array(
            "method" => "login",
            "input_type" => "json",
            "response_type" => "json",
            "rest_data" => $parameters
        ));
        echo $parameters . "\n";
        echo $postData . "\n";
        echo file_get_contents($api_target, false, stream_context_create(array(
            "http" => array(
                "method" => "POST",
                "header" => "Content-Type: application/x-www-form-urlencoded\r\n",
                "content" => $postData
            )
        ))) . "\n";

    I've tried different variations of the parameters, including username instead of user_name, and they all give the same result: just the response "null" and that's it.
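    One way to see why the endpoint answers with a bare "null" is to keep the response headers and decode the body explicitly. A small debugging sketch (debugSugarCall is a made-up helper name; it assumes the same $api_target and $postData built above):

        <?php
        // Hypothetical helper: POSTs the same form-encoded payload, then dumps
        // the HTTP status line and the JSON-decoded body so an error structure
        // can be told apart from a genuine null.
        function debugSugarCall($apiTarget, $postData)
        {
            $context = stream_context_create(array(
                "http" => array(
                    "method"        => "POST",
                    "header"        => "Content-Type: application/x-www-form-urlencoded\r\n",
                    "content"       => $postData,
                    "ignore_errors" => true, // keep the body even on a 4xx/5xx response
                ),
            ));

            $raw = file_get_contents($apiTarget, false, $context);

            // file_get_contents() fills $http_response_header in this scope;
            // the first entry is the status line.
            echo "Status: " . $http_response_header[0] . "\n";
            echo "Raw body: " . var_export($raw, true) . "\n";

            $decoded = json_decode($raw, true);
            var_dump($decoded);
            return $decoded;
        }

    If the server is actually reporting a problem (bad credentials, wrong rest_data encoding), it should show up in the decoded body or in the status line rather than as a silent null.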

    Read the article

  • Is the following valid XHTML 1.0 Transitional?

    - by willem
    The W3C validator service complains that the following HTML is invalid. It does not like the ampersand (&) in my JavaScript. But ampersands are allowed in JavaScript strings, aren't they?

        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">
        <head>
            <title>Page Title</title>
        </head>
        <body>
            <script type="text/javascript">
                function search(query) {
                    redir = "http://search.mysite.com/search?s=FIN&ref=&q=" + query;
                    window.location.href = redir
                    return false;
                }
            </script>
            <span>This is all valid HTML</span>
        </body>
        </html>

    Read the article

  • Form validation

    - by kielie
    Hi guys, I need to create a form that has many of the same fields, which have to be inserted into a database. The problem is that if a user only fills in one or two of the rows, the form will still submit the blank data of the empty fields along with the one or two fields the user has filled in. How can I check for the rows that have not been filled in and leave them out of the query? Or check for those that have been filled in and add them to the query? The thank_you.php file captures the $_POST variables and adds them to the database.

        <form method="post" action="thank_you.php">
            Name: <input type="text" size="28" name="name1" />
            E-mail: <input type="text" size="28" name="email1" />
            <br />
            Name: <input type="text" size="28" name="name2" />
            E-mail: <input type="text" size="28" name="email2" />
            <br />
            Name: <input type="text" size="28" name="name3" />
            E-mail: <input type="text" size="28" name="email3" />
            <br />
            Name: <input type="text" size="28" name="name4" />
            E-mail: <input type="text" size="28" name="email4" />
            <input type="image" src="images/btn_s.jpg" />
        </form>

    I am assuming that I could use JavaScript or jQuery to accomplish this. How would I go about doing it? Thanks in advance for the help.
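    Client-side checks in jQuery can improve the experience, but the filtering still has to happen on the server, since the browser can always submit the raw form. A sketch of one server-side approach for thank_you.php (the mysqli connection values and the "contacts" table name are assumptions):

        <?php
        // Sketch: keep only the name/e-mail pairs the user actually filled in,
        // then insert each kept pair with a prepared statement.
        $mysqli = new mysqli("localhost", "db_user", "db_pass", "db_name");
        $stmt = $mysqli->prepare("INSERT INTO contacts (name, email) VALUES (?, ?)");

        for ($i = 1; $i <= 4; $i++) {
            $name  = isset($_POST["name$i"])  ? trim($_POST["name$i"])  : '';
            $email = isset($_POST["email$i"]) ? trim($_POST["email$i"]) : '';

            // Skip rows the user left blank.
            if ($name === '' && $email === '') {
                continue;
            }

            $stmt->bind_param("ss", $name, $email);
            $stmt->execute();
        }

        $stmt->close();
        $mysqli->close();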

    Read the article

  • Ruby-on-Rails: Multiple has_many :through possible?

    - by williamjones
    Is it possible to have multiple has_many :through relationships that pass through each other in Rails? I received the suggestion to do so as a solution for another question I posted, but have been unable to get it to work. Friends are a cyclic association through a join table. The goal is to create a has_many :through for friends_comments, so I can take a User and do something like user.friends_comments to get all comments made by his friends in a single query.

        class User
          has_many :friendships
          has_many :friends,
                   :through => :friendships,
                   :conditions => "status = #{Friendship::FULL}"
          has_many :comments
          has_many :friends_comments, :through => :friends, :source => :comments
        end

        class Friendship < ActiveRecord::Base
          belongs_to :user
          belongs_to :friend, :class_name => "User", :foreign_key => "friend_id"
        end

    This looks great, and makes sense, but isn't working for me. This is the error I'm getting (in relevant part) when I try to access a user's friends_comments:

        ERROR: column users.user_id does not exist
        : SELECT "comments".* FROM "comments" INNER JOIN "users" ON "comments".user_id = "users".id WHERE (("users".user_id = 1) AND ((status = 2)))

    When I just enter user.friends, which works, this is the query it executes:

        : SELECT "users".* FROM "users" INNER JOIN "friendships" ON "users".id = "friendships".friend_id WHERE (("friendships".user_id = 1) AND ((status = 2)))

    So it seems like it's entirely forgetting about the original has_many through friendship relationship, and then is inappropriately trying to use the User class as a join table. Am I doing something wrong, or is this simply not possible?

    Read the article

  • MySQL: Limit Word Length for MySQL Insert

    - by elmaso
    Hi, every search query is saved in my database, but I want to limit the character length of any single word: "odisafuoiwerjsdkle" is too long and should not be written to the database. My current code is:

        $search = $_GET['q'];
        if (!($sql = mysql_query ('' . 'SELECT * FROM `history` WHERE `Query`=\'' . $search . '\''))) {
            exit ('<b>SQL ERROR:</b> 102, Cannot write history.');
        }
        while ($row = mysql_fetch_array ($sql)) {
            $ID = '' . $row['ID'];
        }
        if ($ID == '') {
            mysql_query ('' . 'INSERT INTO history (Query) values (\'' . $search . '\')');
        }
        if (!($sql = mysql_query ('SELECT * FROM `history` ORDER BY `ID` ASC LIMIT 1'))) {
            exit ('<b>SQL ERROR:</b> 102, Cannot write history.');
        }
        while ($row = mysql_fetch_array ($sql)) {
            $first_id = '' . $row['ID'];
        }
        if (!($sql = mysql_query ('SELECT * FROM `history`'))) {
            exit ('<b>SQL ERROR:</b> 102, Cannot write history.');
        }
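    A sketch of one way to enforce the limit in PHP before the INSERT runs (the 30-character maximum is an arbitrary assumption, the value is escaped because the query interpolates user input, and the duplicate check from the code above is omitted here):

        <?php
        // Sketch: skip the INSERT when any single word in the search string is
        // longer than the chosen maximum.
        $maxWordLength = 30; // assumption: pick whatever limit fits the schema
        $search = isset($_GET['q']) ? $_GET['q'] : '';

        $tooLong = false;
        foreach (preg_split('/\s+/', $search, -1, PREG_SPLIT_NO_EMPTY) as $word) {
            if (strlen($word) > $maxWordLength) {
                $tooLong = true;
                break;
            }
        }

        if (!$tooLong && $search !== '') {
            $safe = mysql_real_escape_string($search);
            mysql_query("INSERT INTO history (Query) VALUES ('" . $safe . "')");
        }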

    Read the article

  • Fastest way to become a MySQL expert?

    - by Kerry
    I have been using MySQL for years, mainly on smaller projects until the last year or so. I'm not sure if it's the nature of the language or my lack of real tutorials that gives me the feeling of being unsure if what I'm writing is the proper way for optimization purposes and scaling purposes. While self-taught in PHP I'm very sure of myself and the code I write, easily can compare it to others and so on. With MySQL, I'm not sure whether (and in what cases) an INNER JOIN or LEFT JOIN should be used, nor am I aware of the large amount of functionality that it has. While I've written code for databases that handled tens of millions of records, I don't know if it's optimum. I often find that a small tweak will make a query take less than 1/10 of the original time... but how do I know that my current query isn't also slow? I would like to become completely confident in this field in the ability to optimize databases and be scalable. Use is not a problem -- I use it on a daily basis in a number of different ways. So, the question is, what's the path? Reading a book? Website/tutorials? Recommendations?

    Read the article

  • Get a y-error plot without a line in Octave

    - by queueoverflow
    I'd like to print a plot with y error bars and just plain points. My current Octave script looks like this:

        errorbar(x_list, y_list, Delta_y_list, "~.x");
        title("physikalisches Pendel");
        xlabel("a^2 [m^2]");
        ylabel("aT^2 [ms^2]");
        print -dpdf plot.pdf

    The plot I get has a line, although I specified the .x style option. How can I get rid of that line? And the ylabel runs into the axis scale as well; is there some way to fix that?

    Read the article

  • How can I disrupt my roommate's BitTorrent?

    - by bob
    We're on a 50 Mb/s Comcast connection, and our throughput right now is coming in under 1.5 Mb/s. Our roommate left for a week with BitTorrent running (the Azureus client, we think). Our latency is approaching 300 ms. His door is locked up tight, and both his machine and the router for the house are inside. I've even flipped the power breaker for the house, and that only helps for about two minutes: his laptop keeps running, and once the cable modem and router come back up and the machine reconnects, the torrents resume in earnest. I've been running nmap and have identified his IP on our LAN. Is there anything I can do over the LAN to make his torrents start to fail or slow down?

    Read the article

  • Exclude pings from Apache error logs (run from PHP exec)

    - by fooraide
    For a number of reasons I need to ping several hosts on a regular basis for a dashboard display. I use this PHP function to do it:

        function PingHost($strIpAddr) {
            exec(escapeshellcmd('ping -q -W 1 -c 1 '.$strIpAddr), $dataresult, $returnvar);
            if (substr($dataresult[4],0,3) == "rtt") {
                // We got a ping result, let's parse it.
                $arr = explode("/",$dataresult[4]);
                return ereg_replace(" ms","",$arr[4]);
            } elseif (substr($dataresult[3],35,16) == "100% packet loss") {
                // Host is down!
                return "Down";
            } elseif ($returnvar == "2") {
                return "No DNS";
            }
        }

    The problem is that whenever there is an unknown host, an error gets logged to my Apache error log (/var/log/apache/error.log). How would I go about disabling logging for this particular function? Disabling logs in the vhost is not an option, since the logs for that vhost are relevant, just not the pings. Thanks,
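    The log entries almost certainly come from ping writing its "unknown host" complaints to stderr, which the exec()'d process inherits from Apache and which therefore land in the error log. A sketch of a quieter variant (PingHostQuiet is a made-up name, the parsing is simplified, and the stderr redirect is kept outside the escaping so it is not treated as part of the hostname):

        <?php
        // Sketch: same ping invocation, but stderr goes to /dev/null instead of
        // Apache's error log. exec() captures only stdout into $dataresult.
        function PingHostQuiet($strIpAddr)
        {
            $cmd = 'ping -q -W 1 -c 1 ' . escapeshellarg($strIpAddr) . ' 2>/dev/null';
            exec($cmd, $dataresult, $returnvar);

            if (isset($dataresult[4]) && substr($dataresult[4], 0, 3) == "rtt") {
                // We got a ping result; parse the average rtt as before.
                $arr = explode("/", $dataresult[4]);
                return str_replace(" ms", "", $arr[4]);
            } elseif ($returnvar == 2) {
                // ping exits with 2 on errors such as an unknown host
                // (the original function already keys off this value).
                return "No DNS";
            }
            return "Down";
        }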

    Read the article

  • Getting "Specified cast is not valid" while importing data from Excel using LINQ to SQL

    - by niceoneishere
    This is my second post. After learning from my first post how fantastic it is to use LINQ to SQL, I wanted to try to import data from an Excel sheet into my SQL database.

    First, my Excel sheet: it contains 4 columns, namely ItemNo, ItemSize, ItemPrice, UnitsSold.

    I have created a database table (named ProductsSold) with the following fields:

        Id int not null identity  -- with auto increment set to true
        ItemNo VarChar(10) not null
        ItemSize VarChar(4) not null
        ItemPrice Decimal(18,2) not null
        UnitsSold int not null

    Now I created a dal.dbml file based on my database and I am trying to import the data from the Excel sheet into the table using the code below. Everything happens on the click of a button.

        private const string forecast_query = "SELECT ItemNo, ItemSize, ItemPrice, UnitsSold FROM [Sheet1$]";

        protected void btnUpload_Click(object sender, EventArgs e)
        {
            var importer = new LinqSqlModelImporter();
            if (fileUpload.HasFile)
            {
                var uploadFile = new UploadFile(fileUpload.FileName);
                try
                {
                    fileUpload.SaveAs(uploadFile.SavePath);
                    if (File.Exists(uploadFile.SavePath))
                    {
                        importer.SourceConnectionString = uploadFile.GetOleDbConnectionString();
                        importer.Import(forecast_query);
                        gvDisplay.DataBind();
                        pnDisplay.Visible = true;
                    }
                }
                catch (Exception ex)
                {
                    Response.Write(ex.Source.ToString());
                    lblInfo.Text = ex.Message;
                }
                finally
                {
                    uploadFile.DeleteFileNoException();
                }
            }
        }

    Here is the code for LinqSqlModelImporter:

        public class LinqSqlModelImporter : SqlImporter
        {
            public override void Import(string query)
            {
                // importing data using an OleDb command and inserting into the db using LINQ to SQL
                using (var context = new WSDALDataContext())
                {
                    using (var myConnection = new OleDbConnection(base.SourceConnectionString))
                    using (var myCommand = new OleDbCommand(query, myConnection))
                    {
                        myConnection.Open();
                        var myReader = myCommand.ExecuteReader();
                        while (myReader.Read())
                        {
                            context.ProductsSolds.InsertOnSubmit(new ProductsSold()
                            {
                                ItemNo = myReader.GetString(0),
                                ItemSize = myReader.GetString(1),
                                ItemPrice = myReader.GetDecimal(2),
                                UnitsSold = myReader.GetInt32(3)
                            });
                        }
                    }
                    context.SubmitChanges();
                }
            }
        }

    Can someone please tell me where I am making the error, or if I am missing something? This is driving me nuts. When I debugged, I got the error "when casting from a number, the value must be a number less than infinity". I really appreciate it.

    Read the article

  • Can I replace my notebook's fan without replacing the heat sink?

    - by kuzzooroo
    My laptop (Acer Travelmate 8204) has started making a grinding noise, which seems to be coming from the fan (not the hard drive). Some articles online mention replacing the fan, but others imply that one should replace both the fan and the heat sink. Does it make sense to replace the fan only? (Addendum: does this procedure really take 3 hours for a noob, as this video implies? Update: the fan certainly seems to be on the "top" layer when I open my laptop, but some parts of it extend under other layers.) Some additional info in response to comments: I have run MS scandisk on the hard drive a few times since the grinding started; it usually reports finding something or other, but the computer's behavior doesn't change. I'm pretty sure it's the fan, based on where the noise is coming from. I employed the cardboard tube trick from the PC World article.

    Read the article

  • Why does HDTune report better-performing drives 2 months after installing them?

    - by Rolnik
    OK, so this is really weird. I ran HDTune on a newly set-up home-built computer and got the following readings from my drives in mid-November (MB/s):

        SSD    154
        RAID1   87
        RAID0  198  (software installs)
        RAID0   98  (swap drive)

    Today, in January, I run HDTune (same version) and get these results (MB/s):

        SSD    186
        RAID1   98
        RAID0  241
        RAID0   98  (swap drive)

    Here are more details that HDTune reports on the SSD drive:

        HD Tune: OCZ-VERTEX Benchmark
        Transfer Rate Minimum : 135.4 MB/sec
        Transfer Rate Maximum : 219.4 MB/sec
        Transfer Rate Average : 185.7 MB/sec
        Access Time           : 0.1 ms
        Burst Rate            : 187.3 MB/sec
        CPU Usage             : -1.0%

    To get to my question: why are my hard drives improving in performance? Most of my logical drives are in some form of RAID, except for the SSD. Will this performance ever deteriorate? Note: none of my drives is a hybrid drive that uses some form of SSD to enhance the writes/reads on actual platters.

    Read the article

  • Automatic document generation

    - by Bowler
    I have some data in an Excel file from which I have to generate a report. I repeat this task fairly regularly and am looking to automate it. I have a LaTeX project into which I usually just copy the data by hand, export the necessary worksheets as PDFs, add them to my LaTeX project, and compile with pdflatex. It has occurred to me that there must be a way to automate this process. Is there an efficient way to export the data from Excel into a LaTeX project? Could a VBA script in Excel run the process? Also, it doesn't have to be LaTeX; I'm not all that experienced with MS Office's more advanced features, so is there some way akin to a mail merge with which I could achieve this? In some ways that might be better, in case I have to pass the work on to someone who doesn't know LaTeX. Thanks.

    Read the article

  • Get the equivalent time between "dynamic" time zones

    - by doctore
    I have a table providers that has three relevant columns (it contains more, but they are not important in this case):

        starttime   time (without time zone) -- start of the window in which you can contact the provider
        endtime     time (without time zone) -- end of that window
        region_id   region where the provider resides; in the USA: California, Texas, etc.; in the UK: England, Scotland, etc.

    starttime and endtime are "time without time zone" columns, but, indirectly, their values are in the time zone of the region in which the provider resides. For example:

        starttime | endtime  | region_id (time zone of region) | "real" st | "real" et
        ----------|----------|---------------------------------|-----------|-----------
        03:00:00  | 17:00:00 | 1 (EGT => -1)                   | 02:00:00  | 16:00:00

    Often I need to get the list of providers whose time range contains the current server time (taking the time zone conversion into account). The problem is that the time zone offsets aren't constant: they may change during summer time. However, this change is specific to each region and is not always carried out at the same moment: EGT <=> EGST, ART <=> ARST, etc.

    The questions are:

    1. Is it necessary to use a web service to update the time zones of the regions every so often? Does anyone know of a web service that would serve?
    2. Is there a better approach to solve this problem?

    Thanks in advance.

    UPDATE: I will give an example to clarify what I'm trying to get. In the table providers I have these records:

        idproviders | starttime | endtime  | region_id
        ------------|-----------|----------|-----------
        1           | 03:00:00  | 17:00:00 | 23 (Texas)
        2           | 04:00:00  | 18:00:00 | 23 (Texas)

    If I execute the query in January, with this information:

        Server time (UTC offset)     = 0 hours
        Texas providers (UTC offset) = +1 hour
        Server time                  = 02:00:00

    I should get the following result: idproviders = 1.

    If I execute the query in June, with this information:

        Server time (UTC offset)     = 0 hours
        Texas providers (UTC offset) = +2 hours (their local time has not changed, but their time zone offset has)
        Server time                  = 02:00:00

    I should get the following results: idproviders = 1 and 2.

    Read the article

  • Is there a way to enable 4 GB of RAM in a 32-bit Windows OS?

    - by Wahid Bitar
    I upgraded my PC to 4 GB of RAM but only get 3 GB. Windows 7 32-bit reports that I have 4 GB of RAM but doesn't use more than 3 GB. Someone told me that 32-bit MS Windows doesn't support more than 3 GB of RAM. So please, is there any way to make my OS (Windows 7 32-bit) support more than 3 GB of RAM?

    Note: I can't move to 64-bit because I have many programs that don't work on a 64-bit OS.

    Edit: I tried what Mr. Wonsungi advised, but whenever I check the option "Enable support for 4 GB of RAM" I get the following error: "Cannot access the registry key HKEY_CLASSES_ROOT\CLSID\{E88DCCE0-11d1-A9F0-00AA0060FA31}." There is no such entry under CLSID in my registry; I don't know why.

    Read the article

  • Does Windows notice when a VM is moved around?

    - by Martin
    I'm thinking of migrating a desktop machine (Windows XP) to a VM solution (VirtualBox or MS Virtual PC). The reason is that I need new hardware anyway and I don't want to (and can't properly) reinstall all the "business" apps on there. So my plan goes as follows: I'll pull an image of the machine and restore it to a virtual machine using Acronis Universal Restore or some other tool that can restore to dissimilar hardware. (The exact process is largely irrelevant for this question, I think.) Once I have this virtual machine properly running, I'll move it to a new PC. So the question is: are there any caveats with respect to Windows (XP?) being installed in a VM and the VM being moved around between different host computers? Can anything break in the OS inside the VM? Will there be trouble with Windows activation?

    Read the article

  • Windows XP - non-user input data filter message after installing wireless keyboard & mouse

    - by James
    After I installed an MS wireless keyboard and mouse and the associated software, I started getting an annoying message titled "Hardware installation" telling me the software I am trying to install did not pass the XP logo tests. The software is for "HID non-user input data filter" and I have two options: continue anyway or stop installation. If I continue, the installation fails; if I stop installing, another message pops up with a little mouse logo and the whole process repeats itself. After I am done with that message, a third dialog appears. This happens every time I boot up my PC (a desktop). I tried following advice I found in some forum and downloaded a Windows update for the HID non-user input data filter, but that installation failed as well. The thing is, both keyboard and mouse are working fine. Is there any way to get past these dialogs?

    Read the article

  • Phantom Local Disks appearing in my drive list

    - by Paul
    I seem to have several phantom local disks mapped to different letters that are 0 bytes in size. Strangely, they do not show up when I view my drives through Windows Explorer, but if I open an application such as ACDSee Pro or MS Word and then go to open a file, I can see all these local disks mapped to different letters. This means that when I plug in my external hard disk it ends up mapped to letter R instead of its usual G, which messes up any programs I have pointing to it by default. How did they get there and, more importantly, how do I get rid of them? I'm on a Windows 7 Home Premium 32-bit machine.

    Read the article

  • SQL Server MDF file database attachment

    - by jnsohnumr
    Hello all. I'm having a bear of a time getting Visual Studio 2010 (Ultimate, I think) to properly attach to my database. It was moved from its original spot to #MYAPP#/#MYAPP#.Web/App_Data/#MDF_FILE#.mdf. I have three instances of SQL Server running on this machine. I have tried to replace the old MDF file with my new one and cannot get the connection string right for it. What I really want to do is just open some DB instance and run a DB create script; then I can have a DB that was generated via my edmx ("generate database from model") in a Silverlight Business Application (C#).

    Right now, when I go to Server Explorer in VS, choose Add New Connection, choose "Microsoft SQL Server Database File (SqlClient)", choose my file location (the App_Data directory), use Windows authentication, and hit the Test Connection button, I get the following error:

        Unable to open the physical file "". Operating system error 5: "5(Access Denied.)". An attempt to attach to an auto-named database for file "" failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share.

    The MDF file was created on the same machine by connecting to (local) in SQL Server Management Studio, opening a new query, pasting in the SQL from the generated DDL file, adding

        CREATE DATABASE [NcrCarDatabase];
        GO

    before the pasted SQL, and executing the query. I then disconnected from the DB in Management Studio, closed Management Studio, navigated to the DATA directory for that instance, and copied the MDF and LDF files to my application's App_Data folder. I am then trying to connect to that same file inside Visual Studio. I hope that gives more clarity to my problem :). The connection string is:

        Data Source=.\SQLEXPRESS;AttachDbFilename=C:\SourceCode\NcrCarDatabase\NcrCarDatabase.Web\App_Data\NcrCarDatabase.mdf;Integrated Security=True;Connect Timeout=30;User Instance=True

    Read the article

  • Exit code 3 (not my return value, looking for source)

    - by Kathoz
    Greetings, my program exits with code 3. No error messages, no exceptions, and the exit is not initiated by my code. The problem occurs when I am trying to read extremely long integer values from a text file (the text file is present and correctly opened, with successful prior reading). I am using very large amounts of memory (in fact, I think that this might be the cause, as I am nearly sure I go over the 2 GB per-process memory limit). I am also using the GMP (or, rather, MPIR) library to multiply bignums. I am fairly sure that this is not a file I/O problem, as I got the same error code with a previous program version that was fully in-memory.

    System:

        MS Visual Studio 2008
        MS Windows Vista Home Premium x86
        MPIR 2.1.0 rc2
        4 GB RAM

    Where might this error code originate from?

    EDIT: this is the procedure that exits with the code:

        void condenseBinSplitFile(const char *sourceFilename, int partCount){
            //condense results file into final P and Q
            std::string tempFilename;
            std::string inputFilename(sourceFilename);
            std::string outputFilename(BIN_SPLIT_FILENAME_DATA2);
            mpz_class *P = new mpz_class(0);
            mpz_class *Q = new mpz_class(0);
            mpz_class *PP = new mpz_class(0);
            mpz_class *QQ = new mpz_class(0);
            FILE *sourceFile;
            FILE *resultFile;
            fpos_t oldPos;
            int swapCount = 0;

            while (partCount > 1){
                std::cout << partCount << std::endl;
                sourceFile = fopen(inputFilename.c_str(), "r");
                resultFile = fopen(outputFilename.c_str(), "w");
                for (int i=0; i<partCount/2; i++){
                    //Multiplication order:
                    //Get Q, skip P
                    //Get QQ, mul Q and QQ, print Q, delete Q
                    //Jump back to P, get P
                    //Mul P and QQ, delete QQ
                    //Skip QQ, get PP
                    //Mul P and PP, delete P and PP

                    //Get Q, skip P
                    mpz_inp_str(Q->get_mpz_t(), sourceFile, CALC_BASE);
                    fgetpos(sourceFile, &oldPos);
                    skipLine(sourceFile);
                    skipLine(sourceFile);
                    //Get QQ, mul Q and QQ, print Q, delete Q
                    mpz_inp_str(QQ->get_mpz_t(), sourceFile, CALC_BASE);
                    (*Q) *= (*QQ);
                    mpz_out_str(resultFile, CALC_BASE, Q->get_mpz_t());
                    fputc('\n', resultFile);
                    (*Q) = 0;
                    //Jump back to P, get P
                    fsetpos(sourceFile, &oldPos);
                    mpz_inp_str(P->get_mpz_t(), sourceFile, CALC_BASE);
                    //Mul P and QQ, delete QQ
                    (*P) *= (*QQ);
                    (*QQ) = 0;
                    //Skip QQ, get PP
                    skipLine(sourceFile);
                    skipLine(sourceFile);
                    mpz_inp_str(PP->get_mpz_t(), sourceFile, CALC_BASE);
                    //Mul P and PP, delete PP, print P, delete P
                    (*P) += (*PP);
                    (*PP) = 0;
                    mpz_out_str(resultFile, CALC_BASE, P->get_mpz_t());
                    fputc('\n', resultFile);
                    (*P) = 0;
                }
                partCount /= 2;
                fclose(sourceFile);
                fclose(resultFile);
                //swap filenames
                tempFilename = inputFilename;
                inputFilename = outputFilename;
                outputFilename = tempFilename;
                swapCount++;
            }
            delete P;
            delete Q;
            delete PP;
            delete QQ;
            remove(BIN_SPLIT_FILENAME_RESULTS);
            if (swapCount%2 == 0)
                rename(sourceFilename, BIN_SPLIT_FILENAME_RESULTS);
            else
                rename(BIN_SPLIT_FILENAME_DATA2, BIN_SPLIT_FILENAME_RESULTS);
        }

    Read the article

  • How to stop Word 2011 from opening hyperlinks on click?

    - by John Yeates
    In previous versions of MS Word, there was a preference for the action taken when the user clicked a hyperlink: open it, or edit it. Word 2011 appears to default to opening the hyperlink, and I can't find the preference to change this behaviour. How can I change Word's default behaviour so that clicking a hyperlink edits its text? Holding down a modifier key when clicking is not an acceptable solution, as the aim here is to prevent misclicks from causing web pages to open. Edit: the links need to stay as links in the saved document, but when clicked on my machine they should not open; Word needs to default to just editing the link, so that an inaccurate click does not take me out of the document and into Safari. Older versions of Word had a preference controlling this, and Microsoft seems to have removed it and fixed the behaviour at the unsafe option.

    Read the article

  • Do I need to convert the older Access database, and, if so, how?

    - by octopusgrabbus
    I have an Access 2003 database. When I click on a pivot table, I get this message from MS Access:

        There isn't enough memory to complete the Automation object operation on the worksheet object.

    There is a lot of discussion about this message. Here is one link: http://community.spiceworks.com/topic/113228-access-2003-file-pivot-table-issue-when-opening-in-access-2010

    But that particular link's explanation doesn't really go into fixing the problem in general, that is, fixing the pivot tables and getting things nicely back together in the original Access database. That's why I am also interested in converting the database to the 2010 format, if that is possible. Are there instructions (I cannot currently find them and would very much appreciate a link) for dealing with this problem in a nice stepwise fashion?

    Read the article
