Search Results

Search found 5287 results on 212 pages for 'physical computing'.


  • I run a command twice - I'm wondering if it'll be a problem

    - by Delirium tremens
    "In computing, tee is a command in various command-line interpreters (shells) such as Unix shells, 4DOS/4NT and Windows PowerShell, which displays or pipes the output of a command and copies it into a file or a variable. It is primarily used in conjunction with pipes and filters." I run echo FRAMEBUFFER=y | sudo tee /etc/initramfs-tools/conf.d/splash twice. I opened the conf.d folder in Nautilus, but there isn't a splash file nor directory. I expected a file to be there with FRAMEBUFFER=y inside but there isn't. Is this going to be a problem?

  • Memory upgrade - is the site reliable?

    - by Yuval
    Hi, I have a late-2008 unibody MacBook with 2 GB of RAM, and I am looking to upgrade to 4 GB. I looked about a month ago at Other World Computing's 4 GB upgrade kit and remember it being around $80. When I looked again today, finally ready to buy, it had gone up to almost $100. I found another site, memoryupgrade.pro, which calls itself "Pro memory upgrade" and looks legitimate - it sells the memory under its own brand for around $80. The only thing is, I haven't been able to find any reviews of it, so I'm not sure it actually is reliable. Does anybody have any experience with this site? Does anybody have any other suggestions for buying MacBook memory? I have friends who bought from OWC and were happy; should I just spend the extra $30 (including shipping) and buy from them? Thanks!

  • Using the same VPC image on multiple workstations

    - by justSteve
    I haven't used VMs before, so I'm brand new to the party. I'm running Win7 off an honest-to-goodness MSDN license, so OS licensing is not an issue. I'd like to think that I could create a VPC image on a USB/eSATA hard drive, move that drive from one Win7-based workstation/laptop to another, and have the same services/desktop/computing environment on all of them. I'm a developer working against the IIS7/SQL08 stack with VS10, so I'm working with apps and services as deeply embedded in the OS as you can get. Should I expect to be able to pull this off? Thanks.

  • How to host a scalable social networking app

    - by christopher-mccann
    I am in the middle of developing a social networking application for a very select user niche, which could scale to a few million users. Up to now I have always hosted applications on Rackspace Cloud, and I have no issues with them at all - it has always been a really good service and I've never had any downtime. My question, though: does anyone think cloud computing is not the way to host scalable web apps? Or can anyone with experience of this recommend a better solution? I have always shunned running big servers from my own facilities, as it seems silly to go to the expense of bringing in backup power supplies and all the other necessary precautions when other companies already do this. I looked at managed hosting services, but they proved a bit too expensive for us at the start, and the scalability wasn't good enough - it would take a day or two to get a new server provisioned. Therefore I ended up on a cloud platform. If anyone has any recommendations or advice, it would be greatly appreciated.

  • 912 stream processors available in OpenCL?

    - by tugrul büyükisik
    I am thinking of assembling this system: an AMD CPU (A8-3870 APU, which has a Radeon HD 6550D inside: 400 stream processors, xxx GFLOPS) for nearly $110; an AMD graphics card, the HD 7750 (512 stream processors, 819 GFLOPS peak performance), for nearly $170; appropriate RAM (1600 MHz bus); and a mainboard. What GFLOPS level can I reach in a stable mode using OpenCL and similar frameworks? Can I use all 912 stream processors at the same time? I am not trying to start a versus question; I need to know which would be better for scientific computing (75% of the time) and gaming (25% of the time), because I have a low budget. By "scientific calculations" I mean fluid dynamics and solid-state physics simulations; by games I mean those that use OpenCL and PhysX.
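
    For checking what OpenCL actually sees on such a box, the stock clinfo utility is enough (a sketch; the grep patterns match clinfo's usual field names, and device names vary by driver):

        # list every OpenCL platform/device the installed runtime exposes
        clinfo | grep -E 'Platform Name|Device Name|Max compute units'

    Both GPUs should show up as separate OpenCL devices, but a single kernel launch never spans two devices automatically - using all 912 stream processors at once means splitting the work across two command queues by hand.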

  • HPC Cluster planning workflow?

    - by Veronica
    After three days of intensive Google searching, I have not found any high-level workflow for building a low-profile, cheap computing cluster (we are not interested in HA yet). This is just a front-end plus one node for now. We want to start small with Rocks Cluster, provide a web-based server for offering services, and then add nodes as our budget increases. We're a small company, so we don't have the human resources to implement it smoothly. Here are some facts about our environment:
    - Our hardware is not constant (we will add nodes).
    - Our workload will vary (on the order of 200 MB to 1 TB).
    - Our software will change (scientific applications for data mining).
    Do you know of any visual workflow, worksheet, or chart describing the general steps needed to begin our cluster planning? A rough command-level sketch follows below.
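
    For what it's worth, the core Rocks workflow is compact enough to sketch in a few commands (the appliance name and smoke-test command here are illustrative; check the Rocks docs for your release):

        # 1. install the frontend interactively from the Rocks DVD/ISO
        # 2. on the frontend, start listening for new nodes:
        insert-ethers --appliance "Compute"
        # 3. PXE-boot each node; Rocks detects it and installs the OS image
        # 4. verify membership and run a command everywhere:
        rocks list host
        rocks run host compute command='uname -a'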

  • Virtualization deployment for datacenter

    - by bogha
    Hi, my company is going to deploy an IT infrastructure on a virtual platform. Can you please help me with the following:
    1. Which do you recommend: Cisco Unified Computing System (Cisco + EMC + VMware) or HP blades (virtualization solution + HP storage)?
    2. I need to install a DNS server, a web server, cPanel for managing hosting packages, and the Microsoft layer of products for use in the corporate infrastructure (Active Directory, local DNS, Exchange Server, DHCP, global catalog). What are the minimum requirements for these servers (in terms of CPU and memory)?
    3. What is the best way to implement a redundant solution in a virtual environment?
    Thank you.

  • Multiple servers acting like a single one with all the hardware?

    - by marc.riera
    Hello. Right now I have 10 servers for HPC, oriented toward computing power. My users need to launch several processes using qmake. The users are used to working with Ubuntu 9.10, and the software from the repositories is suitable for them. I've deployed Ubuntu 9.10 to all 10 servers (PXE rocks). So far we work with parallel-ssh and cluster-ssh, which allow us to launch the same process on all servers. With these tools the servers remain independent, but run the same software and the same launched command. Now we would like to take the next step and see all the servers as a single one, with the resources of the other nine as if they were its own. The difference would be substantial, both in processing time and in the time needed to design the command to launch. Any advice on which software to use would be very useful. Thanks
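
    Short of a true single-system-image kernel (openMosix/Kerrighed territory), the usual next step up from cluster-ssh is a job distributor that treats the ten machines as one pool of CPU slots. A minimal sketch with GNU parallel (nodes.txt and process_one are placeholders, and it assumes the data sits on shared storage such as NFS):

        # nodes.txt holds one "user@host" line per server; parallel fills
        # each node's cores and queues the rest of the jobs automatically
        parallel --sshloginfile nodes.txt 'process_one {}' ::: input_*.dat

    Each job lands on whichever node has a free core, so the pool behaves like one big machine at the granularity of whole processes; anything finer-grained means MPI or a batch scheduler (SGE, Torque).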

  • Azure Linux Virtual Machines Price per Hour, Computation or Running?

    - by Arjun Bajaj
    First of all, I couldn't find a Stack Exchange site on cloud computing; I think this is the most appropriate site, because some of you might be using Azure. So I just wanted to know: the Windows Azure pricing page shows the Linux virtual machine price as $0.013/hr for an extra-small VM, which comes to about $10 per month. Is this price charged per hour of computation done on the VM, or per hour the VM is running? And if I shut down the VM, will I be charged anything?


  • LinkSys WRT54GL + AM200 in half-bridge mode - UK setup guide recommendations?

    - by Peter Mounce
    I am basically looking for a good guide to setting up my home network with this set of hardware. I need:
    - Dynamic DNS
    - Firewall + port forwarding
    - VPN
    - Wake-on-LAN from outside the firewall
    - VOIP (would be nice)
    - QoS (would be nice - make torrents take lower priority than other services while those other services are active)
    - DHCP
    - Wireless with WPA2 security
    - The ability to play multiplayer computer games
    I am not a networking or computing neophyte, but the last time I messed with network gear was a few years ago, so I need to dust off knowledge I only half have. I have read that I should set up the AM200 in half-bridge mode so that the WRT54GL gets the WAN IP - this sounds like a good idea, but I'd still like to be advised. I have read that the dd-wrt firmware will meet my needs (though I gather I'll need the VPN-specific build, which appears to preclude supporting VOIP), but I'm not wedded to using it. I live in the UK and my ISP supplies me with:
    - a block of 8 static IPs, of which 5 are usable to me
    - a PPPoA ADSL2+ connection


  • MySQL: calculating the number of connections needed

    - by Udi I
    I am trying to figure out my needs for web service hosting. After trying Azure, I have realized that the default MySQL they provide (through a third party) limits the account to 4 connections; you can then upgrade the account to 15, 30 or 40 connections (which is quite expensive). Their 15-connection plan is described as: "Excellent choice for light test and staging apps that need a reliable MySQL database". I have 2 questions: (1) If my application is a web service that needs to perform ~120k queries a day (normal/bell distribution), and each query takes ~150 ms (duration) / ~400 ms (fetch), how many connections do I need? (2) If, instead of cloud computing, I choose a VPS, how many connections will I be able to handle on a 1 GB, 2-core VPS? Thank you!
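
    Back-of-the-envelope math suggests the connection count is a non-issue here (a sketch; the 5x peak factor is an assumption, not a measurement):

        # average concurrency = queries/day * seconds held per query / 86400
        echo '120000 * 0.4 / 86400' | bc -l   # ~0.56 connections busy on average
        # bell-shaped traffic: assume the peak hour runs ~5x the average rate
        echo '0.56 * 5' | bc -l               # ~2.8 connections at peak

    Even with a generous safety margin that fits inside the 15-connection plan; on a 1 GB, 2-core VPS the practical ceiling would be MySQL's memory settings (buffer pool, per-connection buffers) long before the connection count.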

  • What happens when you add a graphics card to an i7 with built-in graphics (e.g. HD 4000)?

    - by Matt
    I'm thinking of upgrading my computer from an AMD Phenom II X4 955BE with an AMD Radeon HD 6800 graphics card (discrete, not integrated) to an Intel Core i7 3770. As I have no knowledge of integrated graphics, my question is: what happens to the computing power when I am not using the CPU's integrated HD 4000 graphics (does that mean the CPU will run faster than it would if I relied on it)? Also, which is better: the CPU's built-in HD 4000 or my Radeon graphics card? I am mostly interested in content creation - Adobe After Effects, 3D rendering, etc. - and not too bothered about gaming performance. I will be using the spare parts from this build and older systems to make a second computer for network renders, so what would be the advantages of keeping the Radeon with the current system for that?

  • Staggering Java process startup on Linux to prevent OOM

    - by ctennis
    I am running a number of Java processes on a single Linux machine. From a memory and computing standpoint, everything is fine when things are static. However, periodically we use a configuration management package to upgrade the jar or war files and restart the Java processes. The problem is that it restarts them all relatively quickly, so we get 10 or so JVMs restarting at the same time (we use daemontools for the service stops/starts), which wreaks havoc on the machine in terms of OOMs or everything getting really slow, because it's spawning 10 JVMs simultaneously. Other than trying to stagger the startups, is there a smarter way of handling this? Maybe a sysctl performance-tuning parameter, or a JVM parameter?
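
    A minimal stagger using daemontools itself (a sketch; the service glob and the 30-second gap are assumptions to tune for your JVMs):

        # restart services one at a time so heap allocation and JIT
        # warm-up from ten JVMs don't land on the box simultaneously
        for dir in /etc/service/app-*; do
            svc -t "$dir"    # send TERM; supervise restarts the process
            sleep 30
        done

    Pinning each JVM's heap with matching -Xms/-Xmx values also bounds the worst case if several do end up starting together.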

  • How to manage several Linux workstations like a cluster?

    - by Richard Zak
    How does one go about managing a lab of Linux workstations? I'd like users to be able to log in and run their GUI apps (LibreOffice, Firefox, Eclipse, etc.), and for the computers to also be usable as compute nodes (OpenMPI). That part I'm fine with. But how can I centrally deploy a new software package or upgrade an installed one? How can I reload the entire OS on a given node, as if these workstations were part of a supercomputing cluster? Is there a nice program to help with setting up PXE booting, image management, and remote package management? Ideally such a system would work with Ubuntu. If there isn't a nice package, how could this be set up manually?
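
    For the package half of the question, a low-tech but workable sketch is parallel-ssh against a node list (the hosts file, user, and package name are placeholders; assumes passwordless sudo on the workstations):

        # push a package to every workstation in one shot
        parallel-ssh -h nodes.txt -l admin -i \
            'sudo apt-get update && sudo apt-get -y install libreoffice'

    For the reimaging half, PXE boot plus an Ubuntu preseed file covers the "reload the entire OS" case; FAI and Cobbler are the usual tools that wrap both PXE and image management.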

  • Using PHP to connect to RADIUS works on one server but not another

    - by JDS
    I have a fleet of web servers that serve a LAMP webapp, broken into multiple customer apps by virtualhost/domain. The platform is Ubuntu 10.04 VM + PHP 5.3 + Apache 2.2.14, on top of VMware ESX (v4, I think). This stuff's not too important, though - I'm just setting up the background. I have one customer that connects to a RADIUS server for authentication. We've found that the app responds as if some of the web servers are configured correctly and some are not, i.e. apparently random authentication failures or successes, with no rhyme or reason. I did a lot of analysis of our fleet and narrowed it down to the differences between two specific web servers; I'll call them "A" and "B". "A" works. "B" does not. "Works" means "connects to and gets authentication data successfully from the RADIUS server". Ultimately, I'm looking for the one thing that is different, and I've exhausted everything I can come up with, so I'm looking for something else. Here are the things I've looked at:
    - PHP package versions (all from Ubuntu repos). These are exactly the same across servers.
    - PECL packages. There are no PECL packages that aren't installed by apt.
    - Other libraries or packages. Nothing network-related or RADIUS-related was different among servers. (There were some minor package differences, though.)
    - Network or hosting environment. Some of the working servers are in the same physical environment as some not-working ones (i.e. same ESX containers), so the physical network layer is probably not the problem.
    - Test case. I created the test case below. It works on the working servers and fails on the not-working servers, very consistently.

        <?php
        $radius = radius_auth_open();
        $username = 'theusername';
        $password = 'thepassword';
        $hostname = '12.34.56.78';
        $radius_secret = '39wmmvxghg';

        if (! radius_add_server($radius, $hostname, 0, $radius_secret, 5, 3)) {
            die('Radius Error 1: ' . radius_strerror($radius) . "\n");
        }
        if (! radius_create_request($radius, RADIUS_ACCESS_REQUEST)) {
            die('Radius Error 2: ' . radius_strerror($radius) . "\n");
        }
        radius_put_attr($radius, RADIUS_USER_NAME, $username);
        radius_put_attr($radius, RADIUS_USER_PASSWORD, $password);

        switch (radius_send_request($radius)) {
            case RADIUS_ACCESS_ACCEPT:
                echo 'GOOD LOGIN';
                break;
            case RADIUS_ACCESS_REJECT:
                echo 'BAD LOGIN';
                break;
            case RADIUS_ACCESS_CHALLENGE:
                echo 'CHALLENGE REQUESTED';
                break;
            default:
                die('Radius Error 3: ' . radius_strerror($radius) . "\n");
        }
        ?>
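
    At that point it is usually faster to diff the two machines wholesale than to keep eyeballing them (a sketch; the hostnames are placeholders):

        # compare installed packages between the working and failing server
        ssh A 'dpkg -l | sort' > A.pkgs
        ssh B 'dpkg -l | sort' > B.pkgs
        diff A.pkgs B.pkgs

        # on B, confirm the RADIUS request actually leaves the box
        sudo tcpdump -ni any udp port 1812 or udp port 1813

    If tcpdump shows the Access-Request going out with nothing coming back, the difference lies outside PHP (routing, firewall, or the RADIUS server's client list) rather than in the package set.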

  • Can splitting an Access database cause printer and reporting issues?

    - by leeand00
    We have a setup in which our users log in to an Access database using MS Access 2003 over an RDP connection. The users log in to their own machines first, using a roaming profile, then click an RDP connection file on the desktop and log in to the remote server, where they use MS Access as the shell; they don't have access to explorer.exe features such as the Start menu. The database they log into is really an application: it provides functionality for entering data, querying data, and running reports via form-based menus.

    It all worked pretty well until we split the database because it was nearing 2 GB in size. We moved the payroll data out into a separate partition - a database with the same name in a different folder, both of them on the server. Only two tables were moved into this new partition, and they were re-linked as external tables. While everything appears to be working fine data-wise after the split, there's a new issue when our users log in via RDP and attempt to run reports: often the report will not display, and instead the user sees an error about the click event of the form.

    At first I didn't even know it was printer-related, as we didn't change anything related to the printers as far as I knew. Confused about the error, I talked to the guy who previously worked here and who was in charge of splitting the database, and he told me to tell the users to set their default printer (on their local machines, not on the server) to Microsoft XPS Document Writer, which isn't a physical printer at all. This let the users display their reports, but if they want to print a report they have to go to the File menu and select Print; clicking the print icon on the toolbar takes them to a Save As... dialog, as you would expect with the Microsoft XPS Document Writer as the default printer. It's easy to tell whether a user has the problem: mousing over the printer icon yields a tooltip of (none) when they cannot view their reports, and a tooltip of Microsoft XPS Document Writer when they can. If the user's default printer on their local machine is anything other than Microsoft XPS Document Writer, then (none) is always displayed when they RDP to the database. The RDP settings are set up to transfer the local printer to the server. Telling the users to do this has been a band-aid until we find a better solution - and an explanation as to why splitting a database would prevent users from printing or even viewing Access reports, which is why I'm asking this question.

    Also of note: all the printers on the network now show up on the server, so when users do click File->Print to print on a physical printer, they have to hunt through a huge list to find theirs in the dropdown; previously only the printers on the user's local machine appeared there. So the little band-aid fix we have is not ideal. My co-worker thinks this has something to do with permissions; I personally think it has to do with roaming profiles and Group Policy, which is what I've been reading up on. I really don't know how to fix this or how it is related to splitting the database.

  • Windows 7 disk errors after a few hours of runtime

    - by GFK
    I'm having trouble understanding what is going on with my work PC. Whenever I boot it, it runs fine for a while, then starts to randomly show disk errors. The displayed error often contains the message "not enough storage is available to process this command", although depending on the application that fails it can be different. This has happened for weeks now and is getting worse. This is what troubles me:
    - It never seems to impact critical parts of the system (no BSOD, no freeze).
    - Only some applications seem affected, refusing to function correctly after a while: Outlook 2010 cannot download RSS feeds anymore, Firefox 6 and IE9 cannot download anything bigger than 3 MB without failing, Windows Update fails, all MSI installers fail, Visual Studio 2010 starts failing in weird ways...
    - It only happens after a while of use (typically 3 hours, though installing a program or compiling several times seems to shorten it).
    - Rebooting solves it (temporarily).

    The system:
    - The OS is Windows 7 Pro Spanish SP1, 32-bit.
    - The machine is an HP Compaq 6000 Pro with 4 GB of memory (only 3.4 GB usable since the system is 32-bit) and one 500 GB hard drive.
    - Installed applications include Visual Studio 2010, SQL Server 2008 R2, VMware Workstation 7, Microsoft Security Essentials, and Office 2010.
    - Shutting down all related services and processes doesn't seem to change anything.

    The diagnostics I've run so far:
    - Hard drive: 465 GB, 165 GB free.
    - Process Explorer: physical and virtual memory seem OK (pagefile is 5.3 GB, physical memory usage 70%, system commit 39%).
    - Windows Memory Diagnostic tool: OK.
    - CHKDSK returned (for non-Spanish speakers, that means all OK):

          488282111 KB total disk space.
          281668248 KB in 265779 files.
             150188 KB in 62949 indexes.
                  0 KB in bad sectors.
             571755 KB in use by the system.
          The log file has occupied 65536 kilobytes.
          205891920 KB available on disk.

    - SMART diagnostic tools (DiskCheckup) report all values normal; temperatures are in the normal range (HWiNFO).
    - The Event Viewer doesn't seem to contain any significant message.
    - Ran CCleaner 3, without any noticeable effect.

    I was thinking about some file-count limit (between Visual Studio projects and other applications, there are around 300,000 files on the hard drive), but I couldn't find any. It's possible there is something related to the use of the temporary folders (it's the only explanation I have for why applications fail but Windows doesn't), but I cannot confirm that. The only thing I cannot find out is whether CHKDSK reporting 65 MB for the log is normal; it seems since Vista it always reports this. Any other cleaning/diagnostic tools you might know of?

    Edit: I ran several other tools since I first published the question:
    - Seagate SeaTools (the HD manufacturer's analysis tool): complete test run OK.
    - Intel Rapid 10.1 (the HD controller manufacturer's troubleshooting tool): the HD's OK.
    - Microsoft Desktop Heap Monitor:

          Desktop Heap Information Monitor Tool (Version 8.1.2925.0)
          Copyright (c) Microsoft Corporation. All rights reserved.
          Session ID: 1  Total Desktop: ( 46464 KB - 11 desktops)
          WinStation\Desktop                        Heap Size(KB)  Used Rate(%)
          WinSta0\Winlogon (s1)                               128           3.6
          WinSta0\Disconnect (s1)                              64           3.8
          WinSta0\Default (s1)                              20480           3.0
          msswindowstation\mssrestricteddesk (s0)            1024           0.2
          __X78B95_89_IW__A8D9S1_42_ID (s0)                  1024           0.2
          Service-0x0-3e5$\Default (s0)                      1024           0.6
          Service-0x0-3e4$\Default (s0)                      1024           0.3
          Service-0x0-3e7$\Default (s0)                      1024           2.1
          WinSta0\Winlogon (s0)                               128           1.9
          WinSta0\Disconnect (s0)                              64           3.8
          WinSta0\Default (s0)                              20480           0.0

      All OK; desktop heap usage < 5%.

    Edit 2: I tried totally resetting my account by creating a new one, logging in under the new one, and deleting the first one (local rights and files), then logging back in with the deleted account (it is a domain account). No luck. Also, I found that often the error is "not enough storage is available to process this command". Searching the internet, I found an old troubleshooting tip (setting a registry key to raise the IRP stack limit, whatever that is) which did not change anything.

  • How can I copy files to an external drive and verify their integrity in OS X?

    - by jedavis
    I'm moving large amounts of data from one external drive to another, larger one. The files are important, and the smaller drives need to be cleared and reused (HD camera). Is there some utility for moving files and verifying their integrity? I've been using the command find . -type f -exec md5 '{}' \; > md5list.txt in the terminal to create a list of MD5s for each file, then using diff to compare the two lists. However, I am moving 320 GB at a time, which takes a while by itself, and computing the checksums takes another hour or so. It would be much more efficient to do this on the fly, during the copy. I'm just hoping someone has already written the software...
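
    rsync can do the copy and the verification in one tool, which avoids the separate md5 pass (a sketch; the volume paths are placeholders):

        # copy everything, then re-check: -c forces a full-checksum comparison,
        # and --dry-run means the second pass only reports mismatches
        rsync -avh /Volumes/camera/ /Volumes/archive/camera/
        rsync -avhc --dry-run /Volumes/camera/ /Volumes/archive/camera/

    An empty file list from the second command means every file on the destination matched its source checksum.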

  • Netbook recommendations for a developer

    - by Joe
    I am thinking about getting a netbook as a secondary laptop. Ideally it would mainly be used for surfing/email/travel, but I would like it to be good enough to run Visual Studio for when I am at conferences and the like. I was thinking it would be nice to put a 16-32 GB SSD in it, as well as 2 GB of memory. Do you have any recommendations? Will a netbook even suffice, or should I step up to a small-form-factor laptop? Edit: I don't need to be able to build software on it; it would just be nice to occasionally try out new tools, APIs, or what have you without getting frustrated by limited computing power.

  • Windows 7 root certificate updates

    - by hstr
    I work for a company that uses Windows 7 for end-user computing. The Windows 7 computers are updated via a WSUS installation, and access to Microsoft Update is blocked. We have a problem with a number of websites whose certificates appear to be invalid, though they are perfectly OK. The problem is that Windows 7 apparently does an on-demand update of root certificates through Windows Update, rather than rolling out a monthly update as Windows XP did. Now that Windows Update is blocked, how should root certificates be updated? It appears that WSUS does not handle this feature. Thanks in advance.

  • Can I rent exclusive time on a powerful server running linux? [closed]

    - by Mark Borgerding
    My company is involved in a proposal that requires speed estimates of our software on a server with the latest & greatest processors. This is not the first time we've been in this situation. The servers themselves are too expensive to buy a new one every time, so we end up extrapolating from what we have. There are so many variables: processor generation & speed, memory speed, memory channels, cache configurations; it makes extrapolation difficult and error-prone. Is there a business that rents time on the newest servers? At least part of the time we'd need exclusive access to an otherwise quiescent system either via ssh shell access or unattended batch jobs. I am not looking for general cloud computing services. I don't need much time on the server, but it needs to be exclusive. And the server needs to be pretty cutting edge for a solid basis of estimate.

  • Mac OS X multi-user thin client server (terminal server)?

    - by username
    Is there any solution out there to turn a Mac into a true multi-user thin client server? I'd like to set up a few cheap PCs with access to a couple accounts using something like VNC, but it isn't economical to buy a new server for each user or a new license for virtualized OS X Server for each user. I'm fully aware that OS X Server lets you set up users with "network home folders," and I know there's also VNC built into Mac OS X. Neither of these fit the bill (the former requires a thick client, and the latter is single-user only) UPDATE: yay, Lion! http://www.9to5mac.com/54102/10-7-lion-allows-multi-user-remote-computing

  • suddenly can't connect to router

    - by Khoi
    I was just downloading some stuff in Ubuntu and, snap, the connection cut out - now I can't even connect to my router. The router itself still works fine; my laptop can connect to it wirelessly as usual. But my main computer (which connects to it directly through a cable) can't even ping it. Here is my ipconfig:

        Windows IP Configuration

           Host Name . . . . . . . . . . . . : vento
           Primary Dns Suffix  . . . . . . . :
           Node Type . . . . . . . . . . . . : Unknown
           IP Routing Enabled. . . . . . . . : No
           WINS Proxy Enabled. . . . . . . . : No

        Ethernet adapter Local Area Connection:

           Media State . . . . . . . . . . . : Media disconnected
           Description . . . . . . . . . . . : Realtek RTL8169/8110 Family Gigabit Ethernet NIC
           Physical Address. . . . . . . . . : 00-19-DB-4E-6C-56

        Ethernet adapter {15B1F740-2F35-4FE4-9FEE-4052AFBAD096}:

           Media State . . . . . . . . . . . : Media disconnected
           Description . . . . . . . . . . . : Anchorfree HSS Adapter - Packet Scheduler Miniport
           Physical Address. . . . . . . . . : 00-FF-15-B1-F7-40
